Dec 01 21:33:34 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 21:33:35 crc restorecon[4693]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 
21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 21:33:35 crc restorecon[4693]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 01 21:33:36 crc kubenswrapper[4962]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 21:33:36 crc kubenswrapper[4962]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 01 21:33:36 crc kubenswrapper[4962]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 21:33:36 crc kubenswrapper[4962]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 21:33:36 crc kubenswrapper[4962]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 01 21:33:36 crc kubenswrapper[4962]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.026497 4962 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029064 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029083 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029088 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029092 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029105 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029109 4962 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029113 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029117 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029122 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029127 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029133 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029138 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029144 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029149 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029155 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029161 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029165 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029169 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029173 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029177 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029183 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029188 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029192 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029197 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029203 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029207 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029211 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029215 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029218 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029221 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029225 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029229 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029233 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029236 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029240 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029243 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029247 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029251 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029254 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029258 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029263 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029268 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029272 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029276 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029280 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201
21:33:36.029284 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029288 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029295 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029298 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029302 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029305 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029309 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029312 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029315 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029320 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029326 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029330 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029334 4962 feature_gate.go:330] unrecognized feature gate: Example Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029338 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029341 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029345 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029349 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029353 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029356 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029359 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029363 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029366 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029370 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029373 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029376 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.029380 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 
01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029928 4962 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029954 4962 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029963 4962 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029968 4962 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029974 4962 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029978 4962 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029984 4962 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029993 4962 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.029998 4962 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030003 4962 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030007 4962 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030012 4962 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030016 4962 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030020 4962 flags.go:64] FLAG: --cgroup-root="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030024 4962 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030028 4962 flags.go:64] FLAG: --client-ca-file="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030032 4962 flags.go:64] FLAG: --cloud-config="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030036 4962 flags.go:64] FLAG: --cloud-provider="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030040 4962 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030045 4962 flags.go:64] FLAG: --cluster-domain="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030048 4962 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030053 4962 flags.go:64] FLAG: --config-dir="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030057 4962 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030061 4962 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030066 4962 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030070 4962 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030074 4962 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030079 4962 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030083 4962 flags.go:64] FLAG: --contention-profiling="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 
21:33:36.030087 4962 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030091 4962 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030095 4962 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030099 4962 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030105 4962 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030108 4962 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030113 4962 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030117 4962 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030121 4962 flags.go:64] FLAG: --enable-server="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030126 4962 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030131 4962 flags.go:64] FLAG: --event-burst="100" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030135 4962 flags.go:64] FLAG: --event-qps="50" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030140 4962 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030144 4962 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030148 4962 flags.go:64] FLAG: --eviction-hard="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030153 4962 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030157 4962 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030161 4962 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030165 4962 flags.go:64] FLAG: --eviction-soft="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030169 4962 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030173 4962 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030177 4962 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030181 4962 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030185 4962 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030189 4962 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030193 4962 flags.go:64] FLAG: --feature-gates="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030198 4962 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030202 4962 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030206 4962 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030210 4962 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 
21:33:36.030214 4962 flags.go:64] FLAG: --healthz-port="10248" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030218 4962 flags.go:64] FLAG: --help="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030223 4962 flags.go:64] FLAG: --hostname-override="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030227 4962 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030234 4962 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030238 4962 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030242 4962 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030246 4962 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030250 4962 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030254 4962 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030259 4962 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030263 4962 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030267 4962 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030272 4962 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030277 4962 flags.go:64] FLAG: --kube-reserved="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030281 4962 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030285 4962 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030290 4962 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030294 4962 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030298 4962 flags.go:64] FLAG: --lock-file="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030303 4962 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030307 4962 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030311 4962 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030318 4962 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030322 4962 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030326 4962 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030331 4962 flags.go:64] FLAG: --logging-format="text" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030335 4962 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030340 4962 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030344 4962 flags.go:64] FLAG: --manifest-url="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030348 4962 
flags.go:64] FLAG: --manifest-url-header="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030354 4962 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030358 4962 flags.go:64] FLAG: --max-open-files="1000000" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030363 4962 flags.go:64] FLAG: --max-pods="110" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030368 4962 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030372 4962 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030378 4962 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030383 4962 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030387 4962 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030391 4962 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030396 4962 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030406 4962 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030411 4962 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030415 4962 flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030419 4962 flags.go:64] FLAG: --pod-cidr="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030424 4962 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030431 4962 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030435 4962 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030439 4962 flags.go:64] FLAG: --pods-per-core="0" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030443 4962 flags.go:64] FLAG: --port="10250" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030447 4962 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030452 4962 flags.go:64] FLAG: --provider-id="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030456 4962 flags.go:64] FLAG: --qos-reserved="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030459 4962 flags.go:64] FLAG: --read-only-port="10255" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030464 4962 flags.go:64] FLAG: --register-node="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030468 4962 flags.go:64] FLAG: --register-schedulable="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030472 4962 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030479 4962 flags.go:64] FLAG: --registry-burst="10" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030483 4962 flags.go:64] FLAG: --registry-qps="5" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030487 4962 flags.go:64] 
FLAG: --reserved-cpus="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030491 4962 flags.go:64] FLAG: --reserved-memory="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030496 4962 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030500 4962 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030504 4962 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030508 4962 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030512 4962 flags.go:64] FLAG: --runonce="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030516 4962 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030520 4962 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030525 4962 flags.go:64] FLAG: --seccomp-default="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030529 4962 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030532 4962 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030537 4962 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030541 4962 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030545 4962 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030549 4962 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030553 4962 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030557 4962 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030561 4962 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030572 4962 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030577 4962 flags.go:64] FLAG: --system-cgroups="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030581 4962 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030587 4962 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030591 4962 flags.go:64] FLAG: --tls-cert-file="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030596 4962 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030600 4962 flags.go:64] FLAG: --tls-min-version="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030604 4962 flags.go:64] FLAG: --tls-private-key-file="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030608 4962 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030612 4962 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030616 4962 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030620 4962 flags.go:64] 
FLAG: --v="2" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030626 4962 flags.go:64] FLAG: --version="false" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030632 4962 flags.go:64] FLAG: --vmodule="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030636 4962 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.030641 4962 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.030975 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.030982 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.030987 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.030992 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.030996 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031000 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031004 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031008 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031012 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031015 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031019 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031022 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031026 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031029 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031033 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031037 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031042 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031046 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031049 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031053 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031057 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
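
Two things are worth noting in this wall of feature_gate warnings. First, the FLAG dump shows an empty --feature-gates flag, so every gate named here comes from the featureGates stanza of /etc/kubernetes/kubelet.conf rather than the command line; the list is evidently evaluated several times during startup, which is why the identical run repeats below and the effective map at feature_gate.go:386 prints three times. Second, feature_gate.go:330 only means the name is unknown to this kubelet binary: names such as NewOLM, GatewayAPI and MachineConfigNodes match OpenShift-level feature gates that mean nothing to the kubelet, while the feature_gate.go:353/:351 lines (CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy, KMSv1) are upstream gates the kubelet accepts. A plausible excerpt of a stanza that would produce these warnings, with names copied from the log (the values of the OpenShift-only gates are not recoverable from these lines, so they are left as a comment):

    featureGates:
      CloudDualStackNodeIPs: true                  # GA gate, accepted (feature_gate.go:353)
      DisableKubeletCloudCredentialProviders: true # GA gate, accepted (feature_gate.go:353)
      KMSv1: true                                  # deprecated gate, accepted (feature_gate.go:351)
      ValidatingAdmissionPolicy: true              # GA gate, accepted (feature_gate.go:353)
      # NewOLM, GatewayAPI, MachineConfigNodes, AdminNetworkPolicy, and the other
      # OpenShift-only names each log "unrecognized feature gate" and are ignored
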
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031062 4962 feature_gate.go:330] unrecognized feature gate: Example Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031066 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031069 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031073 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031077 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031080 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031083 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031087 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031090 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031094 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031097 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031101 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031107 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031110 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031113 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031117 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031121 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031124 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031128 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031131 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031135 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031138 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031142 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031145 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031149 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031152 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 
21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031156 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031160 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031176 4962 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031179 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031183 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031187 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031190 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031194 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031198 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031203 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031208 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031212 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031216 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031219 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031223 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031227 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031231 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031235 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031240 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031244 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031247 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031251 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031254 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.031258 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.031269 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.044263 4962 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.044312 4962 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044453 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044469 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044478 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044487 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044496 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044504 4962 feature_gate.go:330] unrecognized feature gate: Example Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044512 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044520 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044529 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044538 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044546 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044554 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044561 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044569 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044577 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044585 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044595 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044602 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044610 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044618 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044626 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044634 
4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044641 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044649 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044657 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044664 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044672 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044679 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044687 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044695 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044706 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044714 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044722 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044730 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044738 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044746 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044754 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044762 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044770 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044778 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044785 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044793 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044800 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044808 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044816 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044827 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044841 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044852 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044865 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044874 4962 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044883 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044892 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044900 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044908 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044916 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044924 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044957 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044966 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044974 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044982 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044990 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.044998 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045006 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045014 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045022 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045032 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045041 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045049 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045059 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045069 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045078 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.045094 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045369 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045386 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045395 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045404 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045412 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045420 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045428 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045436 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045444 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045453 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045460 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045468 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045475 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045483 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045491 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045499 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045509 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045516 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045524 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045532 4962 feature_gate.go:330] 
unrecognized feature gate: AdminNetworkPolicy Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045540 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045547 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045555 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045563 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045574 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045585 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045593 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045601 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045609 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045618 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045627 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045636 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045644 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045652 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045661 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045670 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045678 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045686 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045694 4962 feature_gate.go:330] unrecognized feature gate: Example Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045702 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045711 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045719 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045729 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
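
For readability, here is the effective gate map from the feature_gate.go:386 lines (printed identically three times during this startup), one gate per line. Only names the binary recognizes survive; all of the unrecognized OpenShift gates are dropped:

    CloudDualStackNodeIPs: true
    DisableKubeletCloudCredentialProviders: true
    DynamicResourceAllocation: false
    EventedPLEG: false
    KMSv1: true
    MaxUnavailableStatefulSet: false
    NodeSwap: false
    ProcMountType: false
    RouteExternalCertificate: false
    ServiceAccountTokenNodeBinding: false
    TranslateStreamCloseWebsocketRequests: false
    UserNamespacesPodSecurityStandards: false
    UserNamespacesSupport: false
    ValidatingAdmissionPolicy: true
    VolumeAttributesClass: false
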
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045738 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045746 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045754 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045762 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045772 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045782 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045792 4962 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045801 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045810 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045826 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045835 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045845 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045854 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045863 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045871 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045879 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045887 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045896 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045905 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045914 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045923 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045931 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045965 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045974 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045981 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 
21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045989 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.045997 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.046005 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.046018 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.047351 4962 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.054028 4962 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.054195 4962 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.055034 4962 server.go:997] "Starting client certificate rotation" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.055085 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.055498 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-05 04:46:47.798138335 +0000 UTC Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.055655 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 823h13m11.742489366s for next certificate rotation Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.063482 4962 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.066147 4962 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.076064 4962 log.go:25] "Validated CRI v1 runtime API" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.098458 4962 log.go:25] "Validated CRI v1 image API" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.100198 4962 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.103761 4962 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-21-29-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.103845 4962 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 
minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.130981 4962 manager.go:217] Machine: {Timestamp:2025-12-01 21:33:36.128326881 +0000 UTC m=+0.229766146 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c1819972-ca37-4089-9661-6671772d5e38 BootID:1be34435-728a-4ba5-8928-fb5e344b2f91 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:3e:a6:1b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:3e:a6:1b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:2b:45:47 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:62:7f:9a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:19:cd:f2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3e:7e:91 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:27:ad:07:ba:66 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:92:d9:04:dd:73:2c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 
Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.131380 4962 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
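The certificate_manager.go entries above log an expiration, a rotation deadline well short of it, and the wait until that deadline. A minimal sketch of how such a deadline can be computed, assuming the jittered 70-90% slice of the validity window that client-go's certificate manager uses by default; `notBefore` below is a hypothetical issue time, since the log reports only the expiration side of the window:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// nextRotationDeadline mirrors the jitter scheme behind the "rotation
// deadline" log line: pick a random point between 70% and 90% of the
// certificate's validity window, so a fleet of kubelets does not rotate
// all at once.
func nextRotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// notAfter is taken from the log line above; notBefore is hypothetical.
	notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC)
	notBefore := notAfter.Add(-90 * 24 * time.Hour)

	deadline := nextRotationDeadline(notBefore, notAfter)
	fmt.Printf("rotation deadline: %s (waiting %s)\n",
		deadline, time.Until(deadline).Round(time.Second))
}
```

The arithmetic checks out against the log: 2026-01-05 04:46:47 minus the entry's own timestamp of 2025-12-01 21:33:36 is 823h13m11s, exactly the logged "Waiting 823h13m11.742489366s for next certificate rotation".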
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.131660 4962 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.132138 4962 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.132430 4962 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.132488 4962 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.134980 4962 topology_manager.go:138] "Creating topology manager with none policy"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.135034 4962 container_manager_linux.go:303] "Creating device plugin manager"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.135526 4962 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.135909 4962 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.136621 4962 state_mem.go:36] "Initialized new in-memory state store"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.136782 4962 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.137979 4962 kubelet.go:418] "Attempting to sync node with API server"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.138029 4962 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
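The device plugin lines show the kubelet side of the device plugin protocol coming up: a registration gRPC server on /var/lib/kubelet/device-plugins/kubelet.sock that plugins dial to announce themselves. A sketch of the client half of that v1beta1 handshake, assuming the published k8s.io/kubelet deviceplugin bindings; the plugin endpoint and resource name here are hypothetical placeholders:

```go
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	pluginapi "k8s.io/kubelet/pkg/apis/deviceplugin/v1beta1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Dial the registration socket the kubelet log advertises.
	conn, err := grpc.DialContext(ctx, "unix://"+pluginapi.KubeletSocket,
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock())
	if err != nil {
		log.Fatalf("dial kubelet: %v", err)
	}
	defer conn.Close()

	// Announce a (hypothetical) plugin socket and resource name; the kubelet
	// will then connect back to the plugin's own gRPC endpoint.
	_, err = pluginapi.NewRegistrationClient(conn).Register(ctx, &pluginapi.RegisterRequest{
		Version:      pluginapi.Version,     // "v1beta1", matching the log line
		Endpoint:     "example-device.sock", // hypothetical, relative to the device-plugins dir
		ResourceName: "example.com/device",  // hypothetical extended resource
	})
	if err != nil {
		log.Fatalf("register: %v", err)
	}
	log.Println("registered with the kubelet device plugin manager")
}
```

After Register returns, the kubelet dials back to the plugin's Endpoint socket and drives its ListAndWatch and Allocate RPCs; that half of the protocol is omitted here.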
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.138089 4962 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.138120 4962 kubelet.go:324] "Adding apiserver pod source"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.138144 4962 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.140804 4962 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.141177 4962 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.141531 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.141667 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.141957 4962 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.142063 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.142217 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142623 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142654 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142663 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142673 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142688 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142699 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142711 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142747 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142759 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142771 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142784 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.142795 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.143387 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.143886 4962 server.go:1280] "Started kubelet"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.144347 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.144431 4962 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.144431 4962 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.145364 4962 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 01 21:33:36 crc systemd[1]: Started Kubernetes Kubelet.
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.149455 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.149512 4962 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.148097 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d34ebc017ebb5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 21:33:36.143854517 +0000 UTC m=+0.245293732,LastTimestamp:2025-12-01 21:33:36.143854517 +0000 UTC m=+0.245293732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.149876 4962 server.go:460] "Adding debug handlers to kubelet server"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.149984 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 03:33:38.159594555 +0000 UTC
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.150055 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 606h0m2.009544601s for next certificate rotation
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.150681 4962 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.150713 4962 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.150721 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.150849 4962 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.151416 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused
Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.151500 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.151850 4962 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.151874 4962 factory.go:55] Registering systemd factory
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.151902 4962 factory.go:221] Registration of the systemd container factory successfully
Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.151915 4962 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.154174 4962 factory.go:153] Registering CRI-O factory
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.154212 4962 factory.go:221] Registration of the crio container factory successfully
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.154252 4962 factory.go:103] Registering Raw factory
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.154280 4962 manager.go:1196] Started watching for new ooms in manager
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.157699 4962 manager.go:319] Starting recovery of all containers
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.174275 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.174519 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.174606 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.174710 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.174817 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.174900 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.174927 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175006 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175041 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175111 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175138 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175165 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175254 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175296 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175323 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175351 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175394 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175423 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175500 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175537 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175569 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175648 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175704 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175735 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175765 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175800 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175835 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175908 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175962 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.175990 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176015 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176041 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176180 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176209 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176235 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176265 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176296 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176329 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176357 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176384 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176410 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176440 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176470 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176509 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176537 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176565 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176648 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176681 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176716 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176747 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176777 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176804 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176841 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176873 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176903 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.176966 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177002 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177032 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177058 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177087 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177118 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177148 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177175 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177201 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177234 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177260 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177288 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177320 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177347 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177374 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177401 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177427 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177454 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177482 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177508 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177536 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177562 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177589 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177630 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177656 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177684 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177711 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177737 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177766 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177792 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177821 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177847 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177872 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.177897 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178001 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178103 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178129 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178153 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178176 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178200 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178227 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178252 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178276 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178305 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178332 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178359 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178386 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178413 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178440 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178481 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178513 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178543 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178572 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178599 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178628 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178657 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178685 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178715 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178744 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178775 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178802 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178825 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.178850 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180271 4962 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180327 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180356 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180382 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180407 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180432 4962 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180458 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180483 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180507 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180533 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180564 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180652 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180458 4962 manager.go:324] Recovery completed Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180689 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180717 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180742 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180768 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 
21:33:36.180796 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180829 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180853 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180879 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180903 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.180928 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181000 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181026 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181054 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181082 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181111 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181136 4962 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181166 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181190 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181215 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181243 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181267 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181291 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181323 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181351 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181375 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181398 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181425 4962 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181450 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181473 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181497 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181524 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181548 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181578 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181603 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181626 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181652 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181680 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181706 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181729 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181757 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181782 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181806 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181831 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181856 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181879 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181903 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181974 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.181999 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182026 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182051 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182076 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182102 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182129 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182155 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182183 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182211 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182238 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182263 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182287 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182311 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182338 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182365 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182396 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182422 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182448 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182473 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182497 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182522 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182550 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182577 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182601 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182624 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182648 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182673 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182696 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182720 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182743 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182767 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182794 4962 reconstruct.go:97] "Volume reconstruction finished" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.182811 4962 reconciler.go:26] "Reconciler: start to sync state" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.197468 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.202116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.202174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.202193 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.203692 4962 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.203730 4962 cpu_manager.go:226] "Reconciling" 
reconcilePeriod="10s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.203766 4962 state_mem.go:36] "Initialized new in-memory state store" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.215813 4962 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.218176 4962 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.218256 4962 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.218312 4962 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.218399 4962 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.219027 4962 policy_none.go:49] "None policy: Start" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.220318 4962 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.220383 4962 state_mem.go:35] "Initializing new in-memory state store" Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.220340 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.220485 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.252179 4962 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.285606 4962 manager.go:334] "Starting Device Plugin manager" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.285684 4962 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.285708 4962 server.go:79] "Starting device plugin registration server" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.286392 4962 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.286422 4962 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.286642 4962 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.286756 4962 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.286766 4962 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.294540 4962 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not 
found" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.318765 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.318926 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.320654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.320716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.320735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.321005 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.321389 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.321529 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.322483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.322563 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.322581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.322878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.322928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.322978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.323149 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.323200 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.323333 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.324508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.324552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.324570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.324737 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.324824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.324878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.324907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.325046 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.325133 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326071 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326391 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326391 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326516 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.326558 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327812 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327869 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.327989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.328789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.328834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.328856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.352052 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.385778 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.385821 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.385851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.385877 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.385985 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386040 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386074 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386157 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386346 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386411 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386433 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.386537 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.388288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.388346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.388366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.388419 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.389079 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.487785 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.487896 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.487926 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.487980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488076 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488095 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488138 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488177 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488238 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 
21:33:36.488257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488297 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488404 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488340 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488422 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488483 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488557 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488617 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488621 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488649 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488640 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488628 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.488762 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.589820 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.591636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.591696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.591715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.591754 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.592398 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.653398 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.662955 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.680460 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.692988 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.698671 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e42f6c06f5a86091927ccf77a7b656c49afaf71062ed0d168b98dc670b4add2e WatchSource:0}: Error finding container e42f6c06f5a86091927ccf77a7b656c49afaf71062ed0d168b98dc670b4add2e: Status 404 returned error can't find the container with id e42f6c06f5a86091927ccf77a7b656c49afaf71062ed0d168b98dc670b4add2e Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.701383 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5ba568baa7ddf555aa082a613c16cca32bb7de80d89d10f79d324cfbff3ec681 WatchSource:0}: Error finding container 5ba568baa7ddf555aa082a613c16cca32bb7de80d89d10f79d324cfbff3ec681: Status 404 returned error can't find the container with id 5ba568baa7ddf555aa082a613c16cca32bb7de80d89d10f79d324cfbff3ec681 Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.703251 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.723564 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-120209b27fd695764f9cfcc7614ebcabc199e72898861c742bfceadc5fed7b3a WatchSource:0}: Error finding container 120209b27fd695764f9cfcc7614ebcabc199e72898861c742bfceadc5fed7b3a: Status 404 returned error can't find the container with id 120209b27fd695764f9cfcc7614ebcabc199e72898861c742bfceadc5fed7b3a Dec 01 21:33:36 crc kubenswrapper[4962]: W1201 21:33:36.732972 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2e214f9a75750bd873b03a154b629d70b7cf898a7f212b0f08a60cf91fe0202d WatchSource:0}: Error finding container 2e214f9a75750bd873b03a154b629d70b7cf898a7f212b0f08a60cf91fe0202d: Status 404 returned error can't find the container with id 2e214f9a75750bd873b03a154b629d70b7cf898a7f212b0f08a60cf91fe0202d Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.753636 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.993232 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.995419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.995474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.995489 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:36 crc kubenswrapper[4962]: I1201 21:33:36.995528 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 21:33:36 crc kubenswrapper[4962]: E1201 21:33:36.996043 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 01 21:33:37 crc kubenswrapper[4962]: W1201 21:33:37.004392 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 01 21:33:37 crc kubenswrapper[4962]: E1201 21:33:37.004498 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 01 21:33:37 crc kubenswrapper[4962]: W1201 21:33:37.145475 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": 
dial tcp 38.102.83.110:6443: connect: connection refused Dec 01 21:33:37 crc kubenswrapper[4962]: E1201 21:33:37.145565 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.145799 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.232278 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818" exitCode=0 Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.232381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.232492 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1876af9332262ceec91b0e5d366b3626e5ad165b13f1c04a45982571bbd46bc3"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.232613 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.234044 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.234094 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.234107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.234451 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f376ef58003c1acfc76121626f42fac733707c4a88a8f8998b90cbf82d93730a" exitCode=0 Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.234533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f376ef58003c1acfc76121626f42fac733707c4a88a8f8998b90cbf82d93730a"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.234601 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5ba568baa7ddf555aa082a613c16cca32bb7de80d89d10f79d324cfbff3ec681"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.234707 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.235690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 
21:33:37.235743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.235761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.236497 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.237088 4962 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d" exitCode=0 Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.237175 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.237220 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e42f6c06f5a86091927ccf77a7b656c49afaf71062ed0d168b98dc670b4add2e"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.237305 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.238023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.238059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.238073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.241114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.241163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.241189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.241881 4962 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8" exitCode=0 Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.241958 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.241983 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2e214f9a75750bd873b03a154b629d70b7cf898a7f212b0f08a60cf91fe0202d"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.242076 4962 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.242814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.243069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.243087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.244578 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee"} Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.244614 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"120209b27fd695764f9cfcc7614ebcabc199e72898861c742bfceadc5fed7b3a"} Dec 01 21:33:37 crc kubenswrapper[4962]: W1201 21:33:37.312064 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 01 21:33:37 crc kubenswrapper[4962]: E1201 21:33:37.312151 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 01 21:33:37 crc kubenswrapper[4962]: W1201 21:33:37.325885 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 01 21:33:37 crc kubenswrapper[4962]: E1201 21:33:37.325994 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 01 21:33:37 crc kubenswrapper[4962]: E1201 21:33:37.555715 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 01 21:33:37 crc kubenswrapper[4962]: E1201 21:33:37.576668 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d34ebc017ebb5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 21:33:36.143854517 +0000 UTC m=+0.245293732,LastTimestamp:2025-12-01 21:33:36.143854517 +0000 UTC m=+0.245293732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.797107 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.803997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.804086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.804099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:37 crc kubenswrapper[4962]: I1201 21:33:37.804199 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 21:33:37 crc kubenswrapper[4962]: E1201 21:33:37.804927 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.248163 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c9db7bc8e24555ca08b5f9d5de79e9a7fa267e1def7c01ee93d2b0d0f1b16ac1" exitCode=0 Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.248250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c9db7bc8e24555ca08b5f9d5de79e9a7fa267e1def7c01ee93d2b0d0f1b16ac1"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.248401 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.249294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.249327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.249339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.250319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"329efc0484814f00ee6b557b9788f2847ede432890d1a3a33cea693006ed07cf"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.250447 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.251318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.251362 4962 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.251371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.262453 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.262497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.262508 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.262605 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.263563 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.263603 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.263652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.265669 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.265702 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.265717 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.265748 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.267111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.267150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.267165 4962 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.269511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.269547 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.269558 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8"} Dec 01 21:33:38 crc kubenswrapper[4962]: I1201 21:33:38.269569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6"} Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.278875 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6"} Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.279158 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.280840 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.280900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.280919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.284120 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e7359f92e7c2ddd654f51dca4a176d4fbff7b7845dee34318f8e3855ab767339" exitCode=0 Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.284236 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e7359f92e7c2ddd654f51dca4a176d4fbff7b7845dee34318f8e3855ab767339"} Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.284264 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.284507 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.285790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.285838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 
21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.285854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.286601 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.286652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.286670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.406009 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.407646 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.407720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.407740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:39 crc kubenswrapper[4962]: I1201 21:33:39.407781 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.292138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"64c7b68c7eee0a36ad3b54ba258b618f2940fad0f4f3130d7efdddd9ce6c1011"} Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.292252 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a9ef9598b28d05d90e75ea5eca19e1a132f8e410c819f1a7123e804eaeb8d678"} Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.292283 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b1bd7190cf9956ccaf4863959ec7c13bcb6ec7f254b629b6484f681269b746f3"} Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.292196 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.292398 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.294281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.294354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.294372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.546341 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.546539 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.547876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.548056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:40 crc kubenswrapper[4962]: I1201 21:33:40.548105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.301599 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ca2db71717682e2044fc7c555c92f47d7ef83065773ca8771ff45f7a9ca33e3b"} Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.301669 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1a5d319b99b2914b4812c468b6a57c15faa4c93a8cd73d7abc3d456d8bf01d46"} Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.301823 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.303312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.303360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.303380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.427218 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.427434 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.429000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.429057 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.429078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.943342 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.943558 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.943616 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.945388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:41 crc kubenswrapper[4962]: I1201 21:33:41.945446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:41 crc 
kubenswrapper[4962]: I1201 21:33:41.945466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:42 crc kubenswrapper[4962]: I1201 21:33:42.305298 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:42 crc kubenswrapper[4962]: I1201 21:33:42.306788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:42 crc kubenswrapper[4962]: I1201 21:33:42.306843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:42 crc kubenswrapper[4962]: I1201 21:33:42.306861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.249747 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.308883 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.311011 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.311079 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.311101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.793292 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.793561 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.795358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.795440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.795464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:43 crc kubenswrapper[4962]: I1201 21:33:43.802004 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.072053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.072425 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.072535 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.074829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.074904 4962 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.074924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.312225 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.313637 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.313705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.313723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.428082 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 21:33:44 crc kubenswrapper[4962]: I1201 21:33:44.428212 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.820041 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.820308 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.821926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.821992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.822007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.980451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.980785 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.983116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.983190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:45 crc kubenswrapper[4962]: I1201 21:33:45.983223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:46 crc kubenswrapper[4962]: I1201 21:33:46.220695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:46 crc kubenswrapper[4962]: I1201 21:33:46.220979 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:46 crc kubenswrapper[4962]: I1201 21:33:46.222624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:46 crc kubenswrapper[4962]: I1201 21:33:46.222690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:46 crc kubenswrapper[4962]: I1201 21:33:46.222705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:46 crc kubenswrapper[4962]: E1201 21:33:46.294767 4962 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.085441 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.085684 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.087538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.087603 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.087622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.095749 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.146103 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.330693 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.332040 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.332091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:48 crc kubenswrapper[4962]: I1201 21:33:48.332106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:49 crc kubenswrapper[4962]: E1201 21:33:49.156822 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 01 21:33:49 crc kubenswrapper[4962]: I1201 21:33:49.168397 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 
403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 21:33:49 crc kubenswrapper[4962]: I1201 21:33:49.168495 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 21:33:49 crc kubenswrapper[4962]: I1201 21:33:49.190119 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 21:33:49 crc kubenswrapper[4962]: I1201 21:33:49.190181 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.286219 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.286462 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.288284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.288353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.288371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.306102 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.353050 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.354307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.354358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:53 crc kubenswrapper[4962]: I1201 21:33:53.354377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.082109 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.082365 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.084098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:54 crc 
kubenswrapper[4962]: I1201 21:33:54.084160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.084180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.090756 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.189299 4962 trace.go:236] Trace[1573923252]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 21:33:39.869) (total time: 14319ms): Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[1573923252]: ---"Objects listed" error: 14319ms (21:33:54.189) Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[1573923252]: [14.319931202s] [14.319931202s] END Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.189336 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.190058 4962 trace.go:236] Trace[839999417]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 21:33:39.677) (total time: 14512ms): Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[839999417]: ---"Objects listed" error: 14512ms (21:33:54.190) Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[839999417]: [14.512888754s] [14.512888754s] END Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.190079 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.190549 4962 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.190599 4962 trace.go:236] Trace[1406307108]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 21:33:39.212) (total time: 14977ms): Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[1406307108]: ---"Objects listed" error: 14977ms (21:33:54.190) Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[1406307108]: [14.977807771s] [14.977807771s] END Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.190616 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 21:33:54 crc kubenswrapper[4962]: E1201 21:33:54.192010 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.193219 4962 trace.go:236] Trace[1682833046]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 21:33:39.952) (total time: 14240ms): Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[1682833046]: ---"Objects listed" error: 14239ms (21:33:54.192) Dec 01 21:33:54 crc kubenswrapper[4962]: Trace[1682833046]: [14.240281039s] [14.240281039s] END Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.193261 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.261893 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.267637 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:54 crc kubenswrapper[4962]: I1201 21:33:54.349904 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:54 crc kubenswrapper[4962]: E1201 21:33:54.363042 4962 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.147173 4962 apiserver.go:52] "Watching apiserver" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.151647 4962 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.152014 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.152485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.152690 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.152698 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.153099 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.153163 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.153024 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.153455 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.153498 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.153523 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.154969 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.155579 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.157670 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.157765 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.157802 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.157865 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.157781 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.162834 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.164731 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.195361 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.215041 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.229826 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.244685 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.251976 4962 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.259873 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller
-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.276564 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.296970 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297047 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297084 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297138 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297240 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297287 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297479 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297522 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297579 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297630 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297681 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297728 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297831 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297878 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.297921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298010 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298065 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298107 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298089 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298211 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298160 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298139 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298396 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298437 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298516 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298582 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298614 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298647 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298718 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298750 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298785 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298852 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298886 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 21:33:55 crc 
kubenswrapper[4962]: I1201 21:33:55.298983 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299016 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299047 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299148 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299216 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299247 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299282 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299313 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299347 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299378 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299413 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299447 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299479 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299570 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299605 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299637 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299746 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299882 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299915 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299970 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298229 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298076 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.300041 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298324 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298413 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298781 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298827 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.298988 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299269 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299386 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299361 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299470 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299602 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299779 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299977 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.299978 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.300154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.300077 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.300438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.300535 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.300595 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.301507 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.301572 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.301734 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.301774 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.301805 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.302368 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.302424 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.302456 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.302482 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.302667 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.302700 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.302967 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.300069 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303191 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303364 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303496 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303265 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303324 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303639 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303919 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.303987 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304179 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304195 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304282 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304415 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304482 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.304878 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:33:55.804559667 +0000 UTC m=+19.905998902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.304930 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305038 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305071 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305065 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305109 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305130 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305226 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305143 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305286 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305437 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305627 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305677 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305444 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305777 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305404 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305861 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.305899 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306204 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306275 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306391 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306425 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306458 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306344 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306500 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306585 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.306960 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307001 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307033 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307076 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307110 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307140 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307176 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307208 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307270 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307303 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307399 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307431 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307463 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307498 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307529 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307593 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307624 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307657 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307687 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307718 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307712 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307881 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307923 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307964 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.307992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308031 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308064 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308098 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308135 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308245 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308285 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308325 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308364 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308400 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308472 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308594 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308592 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308889 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.308961 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309042 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309081 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309118 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309178 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309299 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309332 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309369 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309414 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309460 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309570 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309637 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309965 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.309690 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310132 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310190 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310245 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310299 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310354 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310475 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310534 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310589 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310648 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310706 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310762 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310822 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310879 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310929 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311075 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311132 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311186 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311346 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311383 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311417 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311453 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311491 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311528 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311572 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311607 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311647 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311709 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311746 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311785 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311824 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311860 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311896 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311967 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312025 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312071 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312110 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312157 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312274 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312399 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312569 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312716 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312828 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312882 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313014 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313085 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313215 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313515 4962 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313548 4962 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313576 4962 reconciler_common.go:293] "Volume detached for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313607 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313637 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313667 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313698 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313729 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313759 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313787 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313816 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313851 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313878 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313905 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313966 4962 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313999 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node 
\"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314027 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314060 4962 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314090 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314121 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314151 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314179 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314205 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314232 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314259 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314291 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314326 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314358 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314387 4962 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314408 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314430 4962 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314451 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314473 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314495 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314516 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314538 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314558 4962 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314578 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314600 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314621 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314641 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314661 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314682 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314707 4962 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314727 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314748 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314769 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314790 4962 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314811 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314833 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314855 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314876 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314898 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314919 4962 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315008 4962 reconciler_common.go:293] 
"Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315032 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315057 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315079 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315099 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315121 4962 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315141 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315161 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315183 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315205 4962 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315225 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315245 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.315265 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321706 4962 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.322850 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.324972 4962 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.325476 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.328848 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310012 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310025 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310329 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310489 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.310515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311205 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311310 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311465 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311508 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311581 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.311605 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312181 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312242 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312312 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312324 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312849 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.312879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313131 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313321 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313568 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313785 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.313838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314033 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.314100 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.316278 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.317207 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318091 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318668 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.318866 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.319171 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.319251 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.320462 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.320830 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321119 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321130 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321207 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.320485 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321259 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321618 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321637 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321733 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.321907 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.322029 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.322084 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.342847 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:55.842811336 +0000 UTC m=+19.944250541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.322226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.322245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.322657 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.322667 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323034 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323151 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323203 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323341 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323408 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323506 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323798 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.323879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.324131 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.324166 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.325032 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.325342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.325382 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.325750 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.325977 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.326181 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.343361 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:55.843350291 +0000 UTC m=+19.944789496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.328397 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.328638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.343358 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.348513 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.348552 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.348577 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.348700 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:55.848638362 +0000 UTC m=+19.950077597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.351817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.352551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.353393 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.353673 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.353677 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
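
The kube-api-access-* volumes failing above (for example kube-api-access-cqllr) are the projected service-account volumes that bundle a token together with the kube-root-ca.crt and, on OpenShift, openshift-service-ca.crt ConfigMaps; SetUp needs every listed source, which is why the error names exactly those two unregistered objects. A sketch of the shape of such a volume using the k8s.io/api/core/v1 types; the paths and expiry are illustrative, not read from this node:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // kubeAPIAccessVolume builds a projected volume roughly like the
    // generated kube-api-access-* volumes seen in the log above.
    func kubeAPIAccessVolume(name string) corev1.Volume {
        expiry := int64(3607) // illustrative token lifetime
        return corev1.Volume{
            Name: name,
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path:              "token",
                            ExpirationSeconds: &expiry,
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "service-ca.crt", Path: "service-ca.crt"}},
                        }},
                    },
                },
            },
        }
    }

    func main() {
        v := kubeAPIAccessVolume("kube-api-access-cqllr")
        fmt.Println(v.Name, "with", len(v.VolumeSource.Projected.Sources), "projected sources")
    }

Because the projection is assembled atomically, a single missing ConfigMap fails the whole volume, and with it the pod's sandbox creation.
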
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.354707 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.355498 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.355531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.355545 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.355593 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.356524 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.356701 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.357789 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.358149 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.358345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.359231 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:55.856167822 +0000 UTC m=+19.957607047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.359366 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.359416 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.358740 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.360177 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.360321 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.360765 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.362028 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6" exitCode=255 Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.363027 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.363026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6"} Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.363783 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.363922 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.364084 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.364308 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.364528 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.364560 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.364731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.365854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.364800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.367345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.367843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.368390 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.368467 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.368778 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.368817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.369326 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.369577 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.369727 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.369730 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.369817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.369841 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.369851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.370096 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.370317 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.370655 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.370676 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.370823 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.371011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.371131 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.371128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.371231 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.371317 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.372895 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.373153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.373339 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.374289 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.374822 4962 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.374424 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.375743 4962 scope.go:117] "RemoveContainer" containerID="4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.383029 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01
T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.398930 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.399305 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.400173 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.413379 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417009 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417110 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417202 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417223 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417242 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417259 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417276 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417293 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417310 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417328 4962 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417345 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417361 4962 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on 
node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417378 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417394 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417410 4962 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417426 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417443 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417459 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417476 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417494 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417513 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417533 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417550 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417566 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417664 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" 
(UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417828 4962 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417847 4962 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417866 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417883 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417899 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417916 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417958 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417975 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.417992 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418011 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418028 4962 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418044 4962 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418060 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418076 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418092 4962 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418108 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418125 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418143 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418159 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418175 4962 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418191 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418208 4962 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418224 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418240 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418259 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418274 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418290 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418307 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418323 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418340 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418357 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418373 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418390 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418408 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418425 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418441 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418458 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418475 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" 
DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418491 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418508 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418526 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418543 4962 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418559 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418575 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418592 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418607 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418623 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418640 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418657 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418679 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418696 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418712 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418730 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418787 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418875 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418892 4962 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418909 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418925 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418962 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418979 4962 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.418999 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419015 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419032 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419048 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419065 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419083 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419100 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419117 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419133 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419149 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419167 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419184 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419202 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419170 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419218 4962 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419337 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419354 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419372 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419389 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419405 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419421 4962 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419437 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419454 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419508 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419527 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419547 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419563 4962 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419580 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419596 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419629 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419645 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419662 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419678 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419694 4962 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419712 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419728 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419744 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419762 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419780 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419797 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419816 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419833 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419851 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419866 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.419883 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.420747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.436381 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.457467 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.474367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.475140 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.489225 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.493123 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 01 21:33:55 crc kubenswrapper[4962]: W1201 21:33:55.497691 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7e9f90a1515d10102f9d31fecf21467f2cc39606cd81ff2bc71da8c17e3533d9 WatchSource:0}: Error finding container 7e9f90a1515d10102f9d31fecf21467f2cc39606cd81ff2bc71da8c17e3533d9: Status 404 returned error can't find the container with id 7e9f90a1515d10102f9d31fecf21467f2cc39606cd81ff2bc71da8c17e3533d9
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.501129 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.505296 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.521033 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 21:33:55 crc kubenswrapper[4962]: W1201 21:33:55.522526 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7d05fd1f8ac0984c5bfce792abc86e3705a812bb6c4983f2327107f17c141ccf WatchSource:0}: Error finding container 7d05fd1f8ac0984c5bfce792abc86e3705a812bb6c4983f2327107f17c141ccf: Status 404 returned error can't find the container with id 7d05fd1f8ac0984c5bfce792abc86e3705a812bb6c4983f2327107f17c141ccf
Dec 01 21:33:55 crc kubenswrapper[4962]: W1201 21:33:55.527134 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8b13abd3cab9ad4725cfe6ee4ca2fed5f0c2b343f74d1799881bec75c66066e3 WatchSource:0}: Error finding container 8b13abd3cab9ad4725cfe6ee4ca2fed5f0c2b343f74d1799881bec75c66066e3: Status 404 returned error can't find the container with id 8b13abd3cab9ad4725cfe6ee4ca2fed5f0c2b343f74d1799881bec75c66066e3
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.823681 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.823986 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:33:56.823921649 +0000 UTC m=+20.925360874 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.925519 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.925848 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926005 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926030 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.925921 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926125 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:56.926096352 +0000 UTC m=+21.027535577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.926270 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.926326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926414 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926471 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926548 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:56.926517444 +0000 UTC m=+21.027956679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926579 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:56.926565615 +0000 UTC m=+21.028004850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926771 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926856 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.926925 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:55 crc kubenswrapper[4962]: E1201 21:33:55.927089 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:56.927062848 +0000 UTC m=+21.028502123 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:55 crc kubenswrapper[4962]: I1201 21:33:55.981522 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.225593 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.226105 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.226872 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.227464 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.228011 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.228464 4962 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.229044 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.229560 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.230150 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.230620 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.231085 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.231711 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.232196 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.232735 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.235738 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.236333 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.237275 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.237644 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.238189 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.241243 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.241784 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.242490 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.243380 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.244045 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.244402 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.245050 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.245763 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.246822 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.247316 4962 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.248237 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.248722 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.249199 4962 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.249667 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.251471 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.252024 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.252819 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.254227 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.254830 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.255701 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.256343 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.257337 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.257816 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.258852 4962 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.259526 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.260504 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.261028 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.261889 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.262578 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.263712 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.264247 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.265178 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.265816 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.266331 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.267470 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.268021 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.268466 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.294949 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.314017 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.347568 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.365857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7d05fd1f8ac0984c5bfce792abc86e3705a812bb6c4983f2327107f17c141ccf"} Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.367271 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94"} Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.367303 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7e9f90a1515d10102f9d31fecf21467f2cc39606cd81ff2bc71da8c17e3533d9"} Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.370000 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.371280 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.371821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee"} Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.372428 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.374368 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e"} Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.374438 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39"} Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.374460 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8b13abd3cab9ad4725cfe6ee4ca2fed5f0c2b343f74d1799881bec75c66066e3"} Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.388471 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.412694 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01
T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.446443 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.470179 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.492979 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.510759 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.529103 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.545510 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.564657 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.579789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.834290 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.834537 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:33:58.834498234 +0000 UTC m=+22.935937439 (durationBeforeRetry 2s). 
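In the "Failed to update status for pod" records above, the attempted patch is a strategic-merge patch embedded as a twice-quoted string (klog quotes the err value, and the err value quotes the patch), which is why every quote surfaces here as \\\". A rough sketch for recovering the readable patch, assuming one full journal record has been saved to a file named kubelet.log (the file name and the two-pass unescape depth are assumptions inferred from the escaping visible above, not anything the log states):

    import json, re

    # One full journal record, e.g. captured from `journalctl -u kubelet`.
    line = open("kubelet.log").read()

    # The patch sits between \" ... \" inside the err="..." value.
    m = re.search(r'failed to patch status \\"(.+?)\\" for pod', line)
    if m:
        payload = m.group(1)
        for _ in range(2):  # unquote twice: once per quoting layer
            payload = payload.encode().decode("unicode_escape")
        print(json.dumps(json.loads(payload), indent=2))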
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.935433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.935494 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.935523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:33:56 crc kubenswrapper[4962]: I1201 21:33:56.935549 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935643 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935704 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935769 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935800 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935719 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:58.935693831 +0000 UTC m=+23.037133026 (durationBeforeRetry 2s). 
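The UnmountVolume.TearDown failure above means the kubelet currently has no registration record for the kubevirt.io.hostpath-provisioner CSI driver; CSI drivers announce themselves to the kubelet through registration sockets under its root directory. A minimal sketch for checking that on the node, assuming the standard default root dir /var/lib/kubelet (the directory name is the usual kubelet default, not taken from this log):

    import pathlib

    # CSI drivers register with the kubelet via sockets in this directory
    # (default path; it moves if the kubelet runs with a custom --root-dir).
    REG_DIR = pathlib.Path("/var/lib/kubelet/plugins_registry")

    socks = sorted(p.name for p in REG_DIR.iterdir()) if REG_DIR.exists() else []
    print("registered plugin endpoints:", socks or "none")
    print("hostpath-provisioner present:",
          any("hostpath-provisioner" in s for s in socks))

Until the driver pod comes back and re-registers, the TearDown retries above will keep failing the same way.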
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935814 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935847 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:58.935822595 +0000 UTC m=+23.037261790 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935714 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935880 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935893 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935893 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:58.935872766 +0000 UTC m=+23.037311961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:56 crc kubenswrapper[4962]: E1201 21:33:56.935954 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:33:58.935925097 +0000 UTC m=+23.037364292 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.219284 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.219330 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.219286 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.219432 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.219544 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.219633 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.393013 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.395371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.395468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.395487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.395598 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.408133 4962 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.408753 4962 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.410888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.410972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.410991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.411019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.411039 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.440693 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.446256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.446306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.446323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.446347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.446365 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.467397 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.472808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.472881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.472898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.472923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.472991 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.493995 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.505324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.505376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.505386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.505406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.505418 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.524972 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.534195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.534236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.534248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.534266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.534281 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.555301 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:57 crc kubenswrapper[4962]: E1201 21:33:57.555426 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.557507 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.557533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.557542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.557553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.557563 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.662173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.662255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.662276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.662307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.662327 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.766019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.766481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.766500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.766527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.766545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.869763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.869825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.869844 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.869870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.869892 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.974847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.974903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.974922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.974975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:57 crc kubenswrapper[4962]: I1201 21:33:57.974998 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:57Z","lastTransitionTime":"2025-12-01T21:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.078293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.078360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.078375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.078402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.078418 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.182203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.182273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.182294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.182322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.182343 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.285727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.285789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.285806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.285834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.285849 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.388816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.388861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.388870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.388890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.388900 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.490917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.490975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.490985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.491003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.491013 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.593532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.593575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.593588 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.593603 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.593614 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.698519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.698556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.698564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.698579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.698590 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.795048 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-w7dpq"] Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.795332 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b642k"] Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.795529 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.795564 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.797520 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.797616 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.798952 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.799265 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.799336 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.799387 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.799456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.799560 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.801168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.801206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.801220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.801238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.801251 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.820755 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.846210 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.858024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.858256 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:34:02.858221701 +0000 UTC m=+26.959660896 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.869151 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.887525 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.898789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.903961 4962 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.904006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.904017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.904036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.904048 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:58Z","lastTransitionTime":"2025-12-01T21:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.914742 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.929022 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.946698 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959051 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/191b6ce3-f613-4217-b224-a65ee4cfdfe7-mcd-auth-proxy-config\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959072 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959097 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959118 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959139 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prj9m\" (UniqueName: \"kubernetes.io/projected/943bfc9d-612b-4273-9774-f1866b7af4b8-kube-api-access-prj9m\") pod \"node-resolver-w7dpq\" (UID: \"943bfc9d-612b-4273-9774-f1866b7af4b8\") " 
pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959157 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/191b6ce3-f613-4217-b224-a65ee4cfdfe7-proxy-tls\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/943bfc9d-612b-4273-9774-f1866b7af4b8-hosts-file\") pod \"node-resolver-w7dpq\" (UID: \"943bfc9d-612b-4273-9774-f1866b7af4b8\") " pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rf67\" (UniqueName: \"kubernetes.io/projected/191b6ce3-f613-4217-b224-a65ee4cfdfe7-kube-api-access-9rf67\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.959214 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/191b6ce3-f613-4217-b224-a65ee4cfdfe7-rootfs\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959284 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959299 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959348 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959361 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:02.959342746 +0000 UTC m=+27.060781941 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959365 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959287 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959430 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959474 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959494 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959446 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:02.959422918 +0000 UTC m=+27.060862113 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959571 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:02.959559562 +0000 UTC m=+27.060998757 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:33:58 crc kubenswrapper[4962]: E1201 21:33:58.959586 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:02.959578893 +0000 UTC m=+27.061018088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.960906 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.977740 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:58 crc kubenswrapper[4962]: I1201 21:33:58.996533 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.006449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.006487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.006497 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.006512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.006521 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.009696 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.023420 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.037267 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.048444 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059574 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rf67\" (UniqueName: \"kubernetes.io/projected/191b6ce3-f613-4217-b224-a65ee4cfdfe7-kube-api-access-9rf67\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059622 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/191b6ce3-f613-4217-b224-a65ee4cfdfe7-rootfs\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059661 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/191b6ce3-f613-4217-b224-a65ee4cfdfe7-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059708 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prj9m\" (UniqueName: \"kubernetes.io/projected/943bfc9d-612b-4273-9774-f1866b7af4b8-kube-api-access-prj9m\") pod \"node-resolver-w7dpq\" (UID: \"943bfc9d-612b-4273-9774-f1866b7af4b8\") " pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/191b6ce3-f613-4217-b224-a65ee4cfdfe7-proxy-tls\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059762 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/943bfc9d-612b-4273-9774-f1866b7af4b8-hosts-file\") pod \"node-resolver-w7dpq\" (UID: \"943bfc9d-612b-4273-9774-f1866b7af4b8\") " pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/191b6ce3-f613-4217-b224-a65ee4cfdfe7-rootfs\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.059826 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/943bfc9d-612b-4273-9774-f1866b7af4b8-hosts-file\") pod \"node-resolver-w7dpq\" (UID: \"943bfc9d-612b-4273-9774-f1866b7af4b8\") " pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.060663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/191b6ce3-f613-4217-b224-a65ee4cfdfe7-mcd-auth-proxy-config\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.067347 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/191b6ce3-f613-4217-b224-a65ee4cfdfe7-proxy-tls\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.070391 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.091671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prj9m\" (UniqueName: \"kubernetes.io/projected/943bfc9d-612b-4273-9774-f1866b7af4b8-kube-api-access-prj9m\") pod \"node-resolver-w7dpq\" (UID: \"943bfc9d-612b-4273-9774-f1866b7af4b8\") " pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.092612 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.098510 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rf67\" (UniqueName: \"kubernetes.io/projected/191b6ce3-f613-4217-b224-a65ee4cfdfe7-kube-api-access-9rf67\") pod \"machine-config-daemon-b642k\" (UID: \"191b6ce3-f613-4217-b224-a65ee4cfdfe7\") " pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.108580 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w7dpq" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.108996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.109028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.109036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.109050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.109059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.114882 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:33:59 crc kubenswrapper[4962]: W1201 21:33:59.124216 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943bfc9d_612b_4273_9774_f1866b7af4b8.slice/crio-39aec3d68d25c176f5b3739bec8822ea5ab3e5877478febde523119025b488bb WatchSource:0}: Error finding container 39aec3d68d25c176f5b3739bec8822ea5ab3e5877478febde523119025b488bb: Status 404 returned error can't find the container with id 39aec3d68d25c176f5b3739bec8822ea5ab3e5877478febde523119025b488bb Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.126330 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: W1201 21:33:59.135659 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod191b6ce3_f613_4217_b224_a65ee4cfdfe7.slice/crio-37c066bf395b493cb5843ae1f07a78cd4d73221d2670ca165c9b6b53afb749eb WatchSource:0}: Error finding container 37c066bf395b493cb5843ae1f07a78cd4d73221d2670ca165c9b6b53afb749eb: Status 404 returned error can't find the container with id 37c066bf395b493cb5843ae1f07a78cd4d73221d2670ca165c9b6b53afb749eb Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.137849 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.214575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.214627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.214636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.214656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.214666 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.219080 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:33:59 crc kubenswrapper[4962]: E1201 21:33:59.219211 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.219587 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:33:59 crc kubenswrapper[4962]: E1201 21:33:59.219652 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.219702 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:33:59 crc kubenswrapper[4962]: E1201 21:33:59.219757 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.234260 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-m4wg5"] Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.234800 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.249819 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lv2jr"] Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.250711 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.251680 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.251897 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.252275 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.252506 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.252966 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.255630 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.261435 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.270440 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.291921 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.311414 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.317254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.317287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.317297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.317313 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.317324 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.323315 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.338218 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-os-release\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362586 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-system-cni-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362606 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-cnibin\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-cni-bin\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362718 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-netns\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-conf-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-socket-dir-parent\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362911 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-k8s-cni-cncf-io\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-etc-kubernetes\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-system-cni-dir\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363058 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-multus-certs\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363113 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cnibin\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363172 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmfw\" (UniqueName: \"kubernetes.io/projected/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-kube-api-access-vqmfw\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363203 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f38b9e31-13b0-4a48-93bf-b3722ca60642-cni-binary-copy\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363268 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-kubelet\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363290 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4mx\" (UniqueName: \"kubernetes.io/projected/f38b9e31-13b0-4a48-93bf-b3722ca60642-kube-api-access-ws4mx\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363453 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-cni-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363499 
4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-hostroot\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-daemon-config\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363619 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-cni-multus\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363648 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-os-release\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.363670 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cni-binary-copy\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.362603 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.419201 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.421549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.421582 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.421593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.421609 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.421620 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.422976 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.427339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.427372 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"37c066bf395b493cb5843ae1f07a78cd4d73221d2670ca165c9b6b53afb749eb"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.428549 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7dpq" event={"ID":"943bfc9d-612b-4273-9774-f1866b7af4b8","Type":"ContainerStarted","Data":"39aec3d68d25c176f5b3739bec8822ea5ab3e5877478febde523119025b488bb"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.437683 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.450681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-os-release\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464239 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-system-cni-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-cnibin\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464279 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-cni-bin\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-netns\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464333 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-conf-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-socket-dir-parent\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464354 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-os-release\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-k8s-cni-cncf-io\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464417 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-k8s-cni-cncf-io\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-etc-kubernetes\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-system-cni-dir\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464490 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-multus-certs\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464510 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cnibin\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqmfw\" (UniqueName: \"kubernetes.io/projected/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-kube-api-access-vqmfw\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: 
\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464557 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464572 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f38b9e31-13b0-4a48-93bf-b3722ca60642-cni-binary-copy\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-kubelet\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4mx\" (UniqueName: \"kubernetes.io/projected/f38b9e31-13b0-4a48-93bf-b3722ca60642-kube-api-access-ws4mx\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464616 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-cni-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-hostroot\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464647 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-daemon-config\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-cni-multus\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464860 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-os-release\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc 
kubenswrapper[4962]: I1201 21:33:59.464861 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cni-binary-copy\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464898 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-system-cni-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464941 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-cnibin\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-cni-bin\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.464985 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-netns\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465008 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-conf-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-socket-dir-parent\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465632 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cni-binary-copy\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465669 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-kubelet\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465683 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f38b9e31-13b0-4a48-93bf-b3722ca60642-cni-binary-copy\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465722 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-etc-kubernetes\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465746 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-system-cni-dir\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-run-multus-certs\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465789 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cnibin\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465914 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-cni-dir\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.465959 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-hostroot\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.466176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f38b9e31-13b0-4a48-93bf-b3722ca60642-host-var-lib-cni-multus\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.466291 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-os-release\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc 
kubenswrapper[4962]: I1201 21:33:59.466393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f38b9e31-13b0-4a48-93bf-b3722ca60642-multus-daemon-config\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.466456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.468470 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.481310 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqmfw\" (UniqueName: \"kubernetes.io/projected/47902aa9-b3e5-4279-a0ee-23ec28d1c67b-kube-api-access-vqmfw\") pod \"multus-additional-cni-plugins-lv2jr\" (UID: \"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\") " pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.481788 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4mx\" (UniqueName: \"kubernetes.io/projected/f38b9e31-13b0-4a48-93bf-b3722ca60642-kube-api-access-ws4mx\") pod \"multus-m4wg5\" (UID: \"f38b9e31-13b0-4a48-93bf-b3722ca60642\") " pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.482420 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.497211 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.511480 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.524571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.524638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.524651 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.524666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.524675 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.530469 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.549835 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.561635 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m4wg5" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.572256 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" Dec 01 21:33:59 crc kubenswrapper[4962]: W1201 21:33:59.580431 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38b9e31_13b0_4a48_93bf_b3722ca60642.slice/crio-6b3925be105234a145e5d2e68f98c28474d456b0dd783a93c622a01ff302f15b WatchSource:0}: Error finding container 6b3925be105234a145e5d2e68f98c28474d456b0dd783a93c622a01ff302f15b: Status 404 returned error can't find the container with id 6b3925be105234a145e5d2e68f98c28474d456b0dd783a93c622a01ff302f15b Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.581904 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: W1201 21:33:59.590602 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47902aa9_b3e5_4279_a0ee_23ec28d1c67b.slice/crio-ebb9e8cb752b7de1f349ee81d46eac8bc90df51b18de0909e710e63794570649 WatchSource:0}: Error finding container ebb9e8cb752b7de1f349ee81d46eac8bc90df51b18de0909e710e63794570649: Status 404 returned error can't find 
the container with id ebb9e8cb752b7de1f349ee81d46eac8bc90df51b18de0909e710e63794570649 Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.609592 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserv
er-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.628562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.628592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.628600 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.628614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.628625 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.630058 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.631994 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j77n9"] Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.633177 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.638294 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.638339 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.638646 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.638676 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.638713 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.638730 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.638858 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.648278 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.664222 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.675095 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.694879 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.709883 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.721887 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.731726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.731769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.731777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.731797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.731807 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.735241 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.746245 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.755768 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.765824 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767482 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-node-log\") pod \"ovnkube-node-j77n9\" (UID: 
\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-config\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-slash\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767573 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-netd\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767593 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-ovn-kubernetes\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-systemd\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767768 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg5ph\" (UniqueName: \"kubernetes.io/projected/017b2e87-9a6e-46c6-b061-1ed93bfd2322-kube-api-access-fg5ph\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767799 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767825 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovn-node-metrics-cert\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-kubelet\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-etc-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767875 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-ovn\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767890 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-systemd-units\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767907 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-netns\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-env-overrides\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767966 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-bin\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.767992 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-log-socket\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.768010 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-var-lib-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.768024 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-script-lib\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.777169 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.789419 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.813078 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: 
I1201 21:33:59.829191 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.839996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.840052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.840069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.840092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.840107 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.843189 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.861541 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.868463 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-config\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.868568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-slash\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.868638 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-netd\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.868706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-ovn-kubernetes\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.868784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-systemd\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.868852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-openvswitch\") pod 
\"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.868944 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg5ph\" (UniqueName: \"kubernetes.io/projected/017b2e87-9a6e-46c6-b061-1ed93bfd2322-kube-api-access-fg5ph\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869084 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovn-node-metrics-cert\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869157 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-kubelet\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869220 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-etc-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869293 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-ovn\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869365 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-netns\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-env-overrides\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-systemd-units\") pod \"ovnkube-node-j77n9\" (UID: 
\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-bin\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-log-socket\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-var-lib-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.869953 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-script-lib\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.870043 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-node-log\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.870202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-node-log\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.871071 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-config\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.871167 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-slash\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.871247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-netd\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.871318 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-ovn-kubernetes\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.871388 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-systemd\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.871511 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.871836 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872595 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-log-socket\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872685 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-bin\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872701 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-var-lib-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872739 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-systemd-units\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872798 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-netns\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872871 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-etc-openvswitch\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872867 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-ovn\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.872881 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-kubelet\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.873404 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-env-overrides\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.874048 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-script-lib\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.874404 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.877758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovn-node-metrics-cert\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.888100 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:33:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.890201 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg5ph\" (UniqueName: \"kubernetes.io/projected/017b2e87-9a6e-46c6-b061-1ed93bfd2322-kube-api-access-fg5ph\") pod \"ovnkube-node-j77n9\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.942643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.942779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.942876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.942978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:33:59 crc kubenswrapper[4962]: I1201 21:33:59.943088 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:33:59Z","lastTransitionTime":"2025-12-01T21:33:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.046372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.046585 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.046760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.046869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.046978 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.150457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.150519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.150536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.150561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.150579 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.230808 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:00 crc kubenswrapper[4962]: W1201 21:34:00.252415 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017b2e87_9a6e_46c6_b061_1ed93bfd2322.slice/crio-04a387e5711598ad2964df75f2cc21cf336310ff885eaab7bd2f443a7d190bad WatchSource:0}: Error finding container 04a387e5711598ad2964df75f2cc21cf336310ff885eaab7bd2f443a7d190bad: Status 404 returned error can't find the container with id 04a387e5711598ad2964df75f2cc21cf336310ff885eaab7bd2f443a7d190bad Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.253724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.253758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.253769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.253786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.253798 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.356909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.356958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.356973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.356990 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.357009 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.435153 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerStarted","Data":"b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.435222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerStarted","Data":"6b3925be105234a145e5d2e68f98c28474d456b0dd783a93c622a01ff302f15b"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.446424 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.448176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7dpq" event={"ID":"943bfc9d-612b-4273-9774-f1866b7af4b8","Type":"ContainerStarted","Data":"b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.450126 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.451151 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerStarted","Data":"f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.451214 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerStarted","Data":"ebb9e8cb752b7de1f349ee81d46eac8bc90df51b18de0909e710e63794570649"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.452528 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69" exitCode=0 Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.452652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.452697 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"04a387e5711598ad2964df75f2cc21cf336310ff885eaab7bd2f443a7d190bad"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.461858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.461890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.461904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.461921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.461953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.466462 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.483854 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.504307 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.520500 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.539619 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.564725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.564779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.564792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.564810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.564821 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.571069 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.602449 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.623230 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.646652 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.665254 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.669237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.669267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.669277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.669304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.669317 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.686431 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.702192 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.718319 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.731704 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.748793 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.762566 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\
"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.771657 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.771703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.771716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.771739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.771754 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.776852 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.788720 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.802542 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.815695 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.831143 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.847037 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.869060 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.873615 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.873645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.873653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.873667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.873676 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.887486 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.908958 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:00Z 
is after 2025-08-24T17:21:41Z" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.976013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.976051 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.976060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.976074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:00 crc kubenswrapper[4962]: I1201 21:34:00.976083 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:00Z","lastTransitionTime":"2025-12-01T21:34:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.079139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.079205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.079217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.079244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.079257 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.134222 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4wzdm"] Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.134860 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: W1201 21:34:01.137801 4962 reflector.go:561] object-"openshift-image-registry"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 01 21:34:01 crc kubenswrapper[4962]: E1201 21:34:01.137870 4962 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 21:34:01 crc kubenswrapper[4962]: W1201 21:34:01.137809 4962 reflector.go:561] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 01 21:34:01 crc kubenswrapper[4962]: W1201 21:34:01.137892 4962 reflector.go:561] object-"openshift-image-registry"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 01 21:34:01 crc kubenswrapper[4962]: E1201 21:34:01.138014 4962 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 21:34:01 crc kubenswrapper[4962]: W1201 21:34:01.137812 4962 reflector.go:561] object-"openshift-image-registry"/"image-registry-certificates": failed to list *v1.ConfigMap: configmaps "image-registry-certificates" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 01 21:34:01 crc kubenswrapper[4962]: E1201 21:34:01.138072 4962 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"image-registry-certificates\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-registry-certificates\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 21:34:01 crc kubenswrapper[4962]: E1201 21:34:01.137919 4962 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": 
no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.155764 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.182229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.182286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.182299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.182320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.182335 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.185853 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c4763ed-582e-4003-a0da-526ae8aee799-serviceca\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.185911 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c4763ed-582e-4003-a0da-526ae8aee799-host\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.185964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5lb4\" (UniqueName: \"kubernetes.io/projected/8c4763ed-582e-4003-a0da-526ae8aee799-kube-api-access-h5lb4\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.187045 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z 
is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.207981 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.218944 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.219106 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.219016 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:01 crc kubenswrapper[4962]: E1201 21:34:01.219321 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:01 crc kubenswrapper[4962]: E1201 21:34:01.219876 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:01 crc kubenswrapper[4962]: E1201 21:34:01.220053 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.222604 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.237994 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.254285 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.269454 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.282362 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.284569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.284700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.284804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.284949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.285054 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.287525 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c4763ed-582e-4003-a0da-526ae8aee799-serviceca\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.287671 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c4763ed-582e-4003-a0da-526ae8aee799-host\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.287740 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5lb4\" (UniqueName: \"kubernetes.io/projected/8c4763ed-582e-4003-a0da-526ae8aee799-kube-api-access-h5lb4\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.287821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c4763ed-582e-4003-a0da-526ae8aee799-host\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.298504 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.314486 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.328703 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.340367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.350801 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.364202 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.388605 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.388693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.388721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.388756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.388782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.458211 4962 generic.go:334] "Generic (PLEG): container finished" podID="47902aa9-b3e5-4279-a0ee-23ec28d1c67b" containerID="f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef" exitCode=0 Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.458288 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerDied","Data":"f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.465320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.465380 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.465394 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.465407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.465419 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.465430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" 
event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.474977 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.492070 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.492137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.492160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.492189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.492210 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.497700 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.516791 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" 
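The status patches embedded in these err="failed to patch status ..." records are doubly escaped JSON, which makes their conditions and containerStatuses hard to read inline. The following sketch unescapes and pretty-prints such a fragment; the raw literal is a hypothetical, heavily trimmed stand-in (only the uid is copied from the network-node-identity-vrzqb entries above), and real use would paste the full captured string in its place.

// patchdump.go - minimal sketch: unescape a status-patch fragment captured
// from a "Failed to update status for pod" record and pretty-print it.
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

func main() {
	// Hypothetical, trimmed stand-in for a fragment copied out of the log;
	// one level of backslash escaping is assumed already removed.
	raw := `{\"metadata\":{\"uid\":\"ef543e1b-8068-4ea3-b32a-61027b32e95d\"},\"status\":{\"conditions\":[{\"type\":\"Ready\",\"status\":\"True\"}]}}`
	var v any
	if err := json.Unmarshal([]byte(strings.ReplaceAll(raw, `\"`, `"`)), &v); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	out, _ := json.MarshalIndent(v, "", "  ")
	fmt.Println(string(out))
}

With the full fragment pasted in, the indented output makes it easy to see that every rejected patch here is routine status bookkeeping, blocked only by the webhook certificate failure rather than by anything wrong with the pods themselves.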
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.534280 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.553541 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.581822 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.595775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.596249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.596272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.596319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.596335 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.604409 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.619456 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.636682 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.653079 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.695074 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.699524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.699555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.699565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.699581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.699591 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.728892 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.776543 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.804782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.804845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.804865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.804889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.804907 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.807778 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:01Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.908265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.908325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.908337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.908353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:01 crc kubenswrapper[4962]: I1201 21:34:01.908364 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:01Z","lastTransitionTime":"2025-12-01T21:34:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.012626 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.012696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.012715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.012743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.012761 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.117488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.117554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.117571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.117595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.117611 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.181575 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.189347 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c4763ed-582e-4003-a0da-526ae8aee799-serviceca\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.221420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.221504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.221531 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.221561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.221583 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: E1201 21:34:02.304873 4962 projected.go:288] Couldn't get configMap openshift-image-registry/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.325738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.325924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.326059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.326179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.326279 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.429179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.429235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.429254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.429282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.429301 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.430339 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 21:34:02 crc kubenswrapper[4962]: E1201 21:34:02.435674 4962 projected.go:194] Error preparing data for projected volume kube-api-access-h5lb4 for pod openshift-image-registry/node-ca-4wzdm: failed to sync configmap cache: timed out waiting for the condition Dec 01 21:34:02 crc kubenswrapper[4962]: E1201 21:34:02.435800 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c4763ed-582e-4003-a0da-526ae8aee799-kube-api-access-h5lb4 podName:8c4763ed-582e-4003-a0da-526ae8aee799 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:02.935761592 +0000 UTC m=+27.037200827 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h5lb4" (UniqueName: "kubernetes.io/projected/8c4763ed-582e-4003-a0da-526ae8aee799-kube-api-access-h5lb4") pod "node-ca-4wzdm" (UID: "8c4763ed-582e-4003-a0da-526ae8aee799") : failed to sync configmap cache: timed out waiting for the condition Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.473151 4962 generic.go:334] "Generic (PLEG): container finished" podID="47902aa9-b3e5-4279-a0ee-23ec28d1c67b" containerID="a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6" exitCode=0 Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.473259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerDied","Data":"a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.504074 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.532841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.532910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.532924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.532966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.532985 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.538463 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218
dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.558846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.590453 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.606696 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.626037 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.636661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.636739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.636753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.636780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.636801 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.645714 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.662991 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.664190 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.686414 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.686740 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.700956 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.714213 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.724461 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.735150 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.739211 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.739243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.739252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.739268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.739279 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.747196 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:02Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.842278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.842323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.842337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.842357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.842372 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.909630 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:34:02 crc kubenswrapper[4962]: E1201 21:34:02.909895 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:34:10.909877508 +0000 UTC m=+35.011316713 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.945461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.945498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.945511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.945532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:02 crc kubenswrapper[4962]: I1201 21:34:02.945544 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:02Z","lastTransitionTime":"2025-12-01T21:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.010802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.010876 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.010921 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.011016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5lb4\" (UniqueName: \"kubernetes.io/projected/8c4763ed-582e-4003-a0da-526ae8aee799-kube-api-access-h5lb4\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.011052 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.011163 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.011235 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:11.011212299 +0000 UTC m=+35.112651534 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.011798 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.011837 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.011857 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.011906 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:11.011889147 +0000 UTC m=+35.113328382 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.012020 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.012040 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.012054 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.012134 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:11.012120973 +0000 UTC m=+35.113560198 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.012233 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.012272 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:11.012260847 +0000 UTC m=+35.113700082 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.022236 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5lb4\" (UniqueName: \"kubernetes.io/projected/8c4763ed-582e-4003-a0da-526ae8aee799-kube-api-access-h5lb4\") pod \"node-ca-4wzdm\" (UID: \"8c4763ed-582e-4003-a0da-526ae8aee799\") " pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.054234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.054303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.054321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.054346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.054367 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.157559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.157614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.157633 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.157659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.157676 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.219427 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.219489 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.219501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.219602 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.219829 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:03 crc kubenswrapper[4962]: E1201 21:34:03.219983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.260674 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4wzdm" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.271743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.271813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.271846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.271876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.271897 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: W1201 21:34:03.280755 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c4763ed_582e_4003_a0da_526ae8aee799.slice/crio-2cf4669e4745341c829c85bf3bab5440318841adb96ebf54bf735768b3089c5d WatchSource:0}: Error finding container 2cf4669e4745341c829c85bf3bab5440318841adb96ebf54bf735768b3089c5d: Status 404 returned error can't find the container with id 2cf4669e4745341c829c85bf3bab5440318841adb96ebf54bf735768b3089c5d Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.375785 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.375837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.375854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.375880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.375897 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.479358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.479796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.479807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.479820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.479829 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.484823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.486700 4962 generic.go:334] "Generic (PLEG): container finished" podID="47902aa9-b3e5-4279-a0ee-23ec28d1c67b" containerID="1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f" exitCode=0 Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.486736 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerDied","Data":"1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.488599 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wzdm" event={"ID":"8c4763ed-582e-4003-a0da-526ae8aee799","Type":"ContainerStarted","Data":"2cf4669e4745341c829c85bf3bab5440318841adb96ebf54bf735768b3089c5d"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.508270 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.528899 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.543569 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.559646 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.575601 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.583210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.583281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.583301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.583326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.583347 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.599427 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCo
unt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa3321
85dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.621483 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.642597 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.664522 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.679092 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.685972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.686025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.686042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.686069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.686086 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.696556 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.713002 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.732799 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.755880 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:03Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.788703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.788765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.788788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.788814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.788834 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.890909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.890973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.890985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.891003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:03 crc kubenswrapper[4962]: I1201 21:34:03.891016 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:03Z","lastTransitionTime":"2025-12-01T21:34:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.012999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.013045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.013057 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.013076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.013091 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.116167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.116208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.116220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.116236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.116248 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.219356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.219400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.219411 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.219425 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.219437 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.322739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.322802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.322821 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.322845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.322862 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.426227 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.426286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.426309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.426337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.426360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.495062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wzdm" event={"ID":"8c4763ed-582e-4003-a0da-526ae8aee799","Type":"ContainerStarted","Data":"ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.500750 4962 generic.go:334] "Generic (PLEG): container finished" podID="47902aa9-b3e5-4279-a0ee-23ec28d1c67b" containerID="4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5" exitCode=0 Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.500887 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerDied","Data":"4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.531889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.531966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.531985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.532011 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.532028 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.535014 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218
dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.552612 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.568186 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.582974 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.611454 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.625588 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 
2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.636519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.636560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.636570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.636586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.636597 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.659373 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-c
ontroller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.694230 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.715800 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.726233 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.735500 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.738495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.738521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.738530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.738545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.738556 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.746574 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.755244 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.766305 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.776681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.794271 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z 
is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.806640 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.824814 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.836149 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.841218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.841266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.841278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.841296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.841308 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.850292 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.866007 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.879303 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.898587 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"
image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 
21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.921689 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.940051 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.943835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.944072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.944222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.944351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.944478 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:04Z","lastTransitionTime":"2025-12-01T21:34:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.960279 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.975334 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:04 crc kubenswrapper[4962]: I1201 21:34:04.993418 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:04Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.047980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.048077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.048105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.048155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.048181 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.151545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.151819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.152066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.152230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.152369 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.219454 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.219460 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.219552 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:05 crc kubenswrapper[4962]: E1201 21:34:05.219703 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:05 crc kubenswrapper[4962]: E1201 21:34:05.219806 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:05 crc kubenswrapper[4962]: E1201 21:34:05.220102 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.255488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.255542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.255560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.255584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.255603 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.358631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.358678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.358695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.358716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.358734 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.461326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.461380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.461396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.461416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.461430 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.507875 4962 generic.go:334] "Generic (PLEG): container finished" podID="47902aa9-b3e5-4279-a0ee-23ec28d1c67b" containerID="f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a" exitCode=0 Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.508045 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerDied","Data":"f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.530212 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r
f67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.547556 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.564832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.564871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.564879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 
21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.564893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.564905 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.566891 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]
}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.582110 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.596964 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.620361 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.634794 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.652027 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.665666 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.667862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.667914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.667931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.668019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.668036 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.679418 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.695577 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.714980 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.734213 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.754727 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:05Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.771216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.771261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.771277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.771336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.771358 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.893410 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.893476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.893492 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.893510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.893521 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.988276 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.997340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.997385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.997401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.997424 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:05 crc kubenswrapper[4962]: I1201 21:34:05.997440 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:05Z","lastTransitionTime":"2025-12-01T21:34:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.008564 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.028468 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.045445 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.061443 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.078987 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.100624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.100683 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.100702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.100726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.100743 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.109360 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCo
unt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa3321
85dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.129131 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.146734 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.162384 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.183153 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-l
ib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.204822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.204877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.204894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.204918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.204960 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.206271 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm
fw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.220867 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.239686 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.265652 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.283893 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.305198 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.307649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.307710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.307728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.307751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.307770 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.325783 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.346566 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.367825 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.403472 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z 
is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.410767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.410826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.410842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.410868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.410887 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.425758 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-
cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.442888 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.457523 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.476466 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.499842 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.513261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.513329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.513355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.513387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.513410 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.516793 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.518278 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.518349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.518382 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.523713 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.524004 4962 generic.go:334] "Generic (PLEG): container finished" podID="47902aa9-b3e5-4279-a0ee-23ec28d1c67b" containerID="e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f" exitCode=0 Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.524067 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerDied","Data":"e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.542993 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.585348 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.589630 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.594029 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.612149 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.616174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.616212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.616228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.616247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.616264 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.632727 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.653500 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.670474 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b
7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.685899 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.698246 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.717203 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.719229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.719293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.719312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.719340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.719361 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.740730 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.758880 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.774206 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.799916 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.819429 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.821870 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.822008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.822035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.822067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.822086 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.838512 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.870472 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff70b42b58fa14cfcaa208973cc61a1c839d2a
ae7f7f740a3be8ca07a1e518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.925200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.925236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.925247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.925263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:06 crc kubenswrapper[4962]: I1201 21:34:06.925276 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:06Z","lastTransitionTime":"2025-12-01T21:34:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.029074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.029136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.029154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.029181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.029199 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.132443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.132561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.132586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.132614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.132636 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.218610 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.218792 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.218930 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.219114 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.219252 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.219459 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.235472 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.235529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.235551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.235581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.235605 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.338312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.338363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.338381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.338403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.338422 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.443238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.443302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.443324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.443352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.443374 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.535921 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" event={"ID":"47902aa9-b3e5-4279-a0ee-23ec28d1c67b","Type":"ContainerStarted","Data":"131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.550440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.550499 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.550526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.550560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.550582 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.560037 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.581696 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.598803 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.616827 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.639488 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.654182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.654259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.654282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.654308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.654324 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.663065 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.681320 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.706794 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff70b42b58fa14cfcaa208973cc61a
1c839d2aae7f7f740a3be8ca07a1e518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.727872 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.749250 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.756741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.756797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.756816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.756842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.756863 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.758245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.758313 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.758324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.758338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.758350 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.770893 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.789568 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.789379 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.797283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.797325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.797334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.797347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.797356 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.813955 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.815439 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.817639 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.817665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.817677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.817694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.817706 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.830702 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.830781 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.834034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.834074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.834086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.834105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.834117 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.847478 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.851329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.851365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
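Interleaved with the webhook failures, the kubelet keeps publishing NotReady because the container runtime reports no CNI network configuration in /etc/kubernetes/cni/net.d/. A minimal sketch of the shape of that readiness probe, assuming the runtime accepts .conf, .conflist, and .json files in the conf dir; hasCNIConfig is a hypothetical helper for illustration, not kubelet or CRI-O code.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains at least one CNI network
// config file. The accepted extensions mirror common CNI conventions;
// treat the exact list as an assumption rather than the runtime's code.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if !ok {
		// Mirrors the condition in the log: NetworkReady=false,
		// reason NetworkPluginNotReady, so the node stays NotReady.
		fmt.Println("no CNI configuration file found; node will report NotReady")
		return
	}
	fmt.Println("CNI configuration present")
}

The Ready condition clears on its own once the network plugin writes its config into that directory; in this boot, kube-multus is already Running, so the missing file plausibly points at the OVN/network operator components that have not come up yet.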
event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.851377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.851392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.851404 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.864082 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:07Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:07 crc kubenswrapper[4962]: E1201 21:34:07.864229 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.866151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.866198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.866211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.866234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.866251 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.969367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.969421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.969433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.969455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:07 crc kubenswrapper[4962]: I1201 21:34:07.969469 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:07Z","lastTransitionTime":"2025-12-01T21:34:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.073032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.073109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.073134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.073165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.073190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.176426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.176489 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.176507 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.176531 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.176549 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.278630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.278657 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.278664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.278676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.278685 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.381238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.381266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.381275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.381288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.381297 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.485037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.485100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.485118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.485143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.485161 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.587516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.587578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.587595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.587620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.587639 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.690805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.690874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.690987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.691020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.691037 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.794104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.794166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.794183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.794207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.794225 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.897612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.898114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.898137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.898167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:08 crc kubenswrapper[4962]: I1201 21:34:08.898188 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:08Z","lastTransitionTime":"2025-12-01T21:34:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.000863 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.000977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.001010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.001042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.001062 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.103845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.103920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.103981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.104014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.104043 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.207316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.207388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.207436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.207471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.207494 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.218719 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.218766 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.218719 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:09 crc kubenswrapper[4962]: E1201 21:34:09.218883 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:09 crc kubenswrapper[4962]: E1201 21:34:09.219048 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:09 crc kubenswrapper[4962]: E1201 21:34:09.219157 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.310602 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.310666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.310684 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.310708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.310724 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.414327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.414390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.414407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.414432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.414451 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.518406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.518541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.518595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.518694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.518714 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.545291 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/0.log" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.549871 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518" exitCode=1 Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.549913 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.551054 4962 scope.go:117] "RemoveContainer" containerID="54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.573151 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.588682 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.607347 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.626535 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.626596 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.626613 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.626640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.626659 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.627415 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.651840 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.680219 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:08Z\\\",\\\"message\\\":\\\"lector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423246 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423491 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423555 6272 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 21:34:08.423585 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 21:34:08.423592 6272 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 21:34:08.423610 6272 factory.go:656] Stopping watch factory\\\\nI1201 21:34:08.423621 6272 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 21:34:08.423630 6272 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 21:34:08.423636 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 21:34:08.423687 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423902 6272 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.424276 6272 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.698135 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mou
ntPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.724097 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.730379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.730465 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc 
kubenswrapper[4962]: I1201 21:34:09.730483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.730513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.730532 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.740224 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 
01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.759423 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.774025 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.787888 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.812610 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches 
populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.832591 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.832846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.832910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.832953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.832978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.832995 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.935509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.935573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.935585 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.935600 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:09 crc kubenswrapper[4962]: I1201 21:34:09.935608 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:09Z","lastTransitionTime":"2025-12-01T21:34:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.038229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.038285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.038302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.038325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.038341 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.140526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.140559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.140568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.140583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.140593 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.250478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.250518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.250529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.250543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.250555 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.353162 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.353199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.353210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.353224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.353236 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.455850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.455909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.455926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.455984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.456007 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.558099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.558148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.558163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.558181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.558193 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.558603 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/0.log" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.564208 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.568304 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.590880 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.611083 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.625755 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.646546 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.663836 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.663910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.663933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.663998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.664023 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.668704 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.688836 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.738038 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc 
kubenswrapper[4962]: I1201 21:34:10.764302 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.766703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.766730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.766740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.766753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.766762 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.782577 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.793975 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.807825 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.820590 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.830475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.850488 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d49390334652
4eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:08Z\\\",\\\"message\\\":\\\"lector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423246 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423491 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423555 6272 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 21:34:08.423585 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 21:34:08.423592 6272 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 21:34:08.423610 6272 factory.go:656] Stopping watch factory\\\\nI1201 21:34:08.423621 6272 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 21:34:08.423630 6272 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 21:34:08.423636 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 21:34:08.423687 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423902 6272 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.424276 6272 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.876383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.876417 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.876429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.876448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.876463 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.926159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:34:10 crc kubenswrapper[4962]: E1201 21:34:10.926521 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:34:26.926487499 +0000 UTC m=+51.027926734 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.979755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.979835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.979853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.979882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:10 crc kubenswrapper[4962]: I1201 21:34:10.979901 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:10Z","lastTransitionTime":"2025-12-01T21:34:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.027474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.027745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.027907 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.028004 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.028026 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.028116 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:27.028084967 +0000 UTC m=+51.129524152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.028209 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.028327 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:27.028297543 +0000 UTC m=+51.129736768 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.027984 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.028626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.028724 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.029032 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:27.029001801 +0000 UTC m=+51.130441036 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.028741 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.029373 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.029496 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.029701 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:27.029681109 +0000 UTC m=+51.131120344 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.083577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.083630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.083644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.083663 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.083678 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.187345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.187415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.187434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.187466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.187486 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.218839 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.218896 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.219040 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.219254 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.219249 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.219668 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.290918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.291009 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.291034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.291065 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.291093 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.394647 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.394756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.394778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.394805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.394822 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.498034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.498085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.498102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.498127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.498149 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.571049 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/1.log"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.572170 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/0.log"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.576817 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c" exitCode=1
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.576917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.577298 4962 scope.go:117] "RemoveContainer" containerID="54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.578107 4962 scope.go:117] "RemoveContainer" containerID="b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c"
Dec 01 21:34:11 crc kubenswrapper[4962]: E1201 21:34:11.578356 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.601567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.601628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.601643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.601667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.601685 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.602873 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.629421 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:08Z\\\",\\\"message\\\":\\\"lector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423246 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423491 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423555 6272 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 21:34:08.423585 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 21:34:08.423592 6272 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 21:34:08.423610 6272 factory.go:656] Stopping watch factory\\\\nI1201 21:34:08.423621 6272 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 21:34:08.423630 6272 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 21:34:08.423636 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 21:34:08.423687 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423902 6272 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.424276 6272 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.656049 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.678270 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.699426 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.705681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.705733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.705750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.705779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.705798 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.723788 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.744612 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.765875 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.788422 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.807469 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.809298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.809348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.809369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.809460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.809483 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.821694 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.840471 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.861412 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.882547 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.912551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.912595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.912611 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.912633 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.912648 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:11Z","lastTransitionTime":"2025-12-01T21:34:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.953901 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk"] Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.954820 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.958446 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.958854 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.981817 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:11 crc kubenswrapper[4962]: I1201 21:34:11.999511 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:11Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.015912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.015991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.016010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.016033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.016051 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.017277 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.040904 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b
00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.059981 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.081573 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012af
cf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.102248 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.118804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.118858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.118870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.118889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.118902 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.121445 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.141492 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpt7r\" (UniqueName: \"kubernetes.io/projected/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-kube-api-access-vpt7r\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.141540 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.141565 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.141589 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.143619 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.159514 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.177185 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.187895 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.202104 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.216268 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.222049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.222118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.222131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.222147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.222158 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.236287 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54ff70b42b58fa14cfcaa208973cc61a1c839d2aae7f7f740a3be8ca07a1e518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:08Z\\\",\\\"message\\\":\\\"lector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423246 6272 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423491 6272 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423555 6272 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 21:34:08.423585 6272 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 21:34:08.423592 6272 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 21:34:08.423610 6272 factory.go:656] Stopping watch factory\\\\nI1201 21:34:08.423621 6272 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 21:34:08.423630 6272 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 21:34:08.423636 6272 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 21:34:08.423687 6272 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.423902 6272 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 21:34:08.424276 6272 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.242307 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt7r\" (UniqueName: \"kubernetes.io/projected/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-kube-api-access-vpt7r\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.242351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.242381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.242410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.243126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.243873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.248776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.265841 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpt7r\" (UniqueName: \"kubernetes.io/projected/84505e9c-7b91-400d-b30b-d7d2cfe3c29b-kube-api-access-vpt7r\") pod \"ovnkube-control-plane-749d76644c-fqtnk\" (UID: \"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.284995 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" Dec 01 21:34:12 crc kubenswrapper[4962]: W1201 21:34:12.304974 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84505e9c_7b91_400d_b30b_d7d2cfe3c29b.slice/crio-f66f1af4d13ffcfe49e05ce182ea29695e77005a8d4ea2c96b61e4f89efdc57c WatchSource:0}: Error finding container f66f1af4d13ffcfe49e05ce182ea29695e77005a8d4ea2c96b61e4f89efdc57c: Status 404 returned error can't find the container with id f66f1af4d13ffcfe49e05ce182ea29695e77005a8d4ea2c96b61e4f89efdc57c Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.325474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.325540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.325563 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.325596 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.325615 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.428079 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.428132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.428147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.428166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.428180 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.531464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.531515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.531528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.531546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.531900 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.585239 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/1.log" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.591475 4962 scope.go:117] "RemoveContainer" containerID="b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c" Dec 01 21:34:12 crc kubenswrapper[4962]: E1201 21:34:12.591820 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.598065 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" event={"ID":"84505e9c-7b91-400d-b30b-d7d2cfe3c29b","Type":"ContainerStarted","Data":"4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.598125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" event={"ID":"84505e9c-7b91-400d-b30b-d7d2cfe3c29b","Type":"ContainerStarted","Data":"f66f1af4d13ffcfe49e05ce182ea29695e77005a8d4ea2c96b61e4f89efdc57c"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.609274 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.635899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.635980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.636005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.635824 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d49390334652
4eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.636034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.636232 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.659164 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.680861 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.703264 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.731506 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.742313 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.742399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.743087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.743121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.743145 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.756528 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.782739 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.805922 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.823836 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.846266 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.846550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.846599 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.846616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.846645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.846665 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.865441 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.883450 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.897245 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.908146 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:12Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.950176 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.950232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.950251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.950277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:12 crc kubenswrapper[4962]: I1201 21:34:12.950295 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:12Z","lastTransitionTime":"2025-12-01T21:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.053978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.054047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.054061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.054280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.054294 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.159313 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.159374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.159392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.159420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.159439 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.219049 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:13 crc kubenswrapper[4962]: E1201 21:34:13.219274 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.219591 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:13 crc kubenswrapper[4962]: E1201 21:34:13.219660 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.219692 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:13 crc kubenswrapper[4962]: E1201 21:34:13.219737 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.262592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.262653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.262670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.262695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.262715 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.364975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.365038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.365059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.365082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.365100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.468187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.468222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.468231 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.468246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.468257 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.499545 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2q5q5"] Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.500656 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:13 crc kubenswrapper[4962]: E1201 21:34:13.500820 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.521309 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.552125 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.561808 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wgwq\" (UniqueName: \"kubernetes.io/projected/5e1746bf-6971-44aa-ae52-f349e6963eb2-kube-api-access-9wgwq\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.562023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.572008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.572081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.572101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.572131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.572157 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.574743 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.601698 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.608068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" event={"ID":"84505e9c-7b91-400d-b30b-d7d2cfe3c29b","Type":"ContainerStarted","Data":"c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.624623 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.649126 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.663841 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wgwq\" (UniqueName: \"kubernetes.io/projected/5e1746bf-6971-44aa-ae52-f349e6963eb2-kube-api-access-9wgwq\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.664582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:13 crc kubenswrapper[4962]: E1201 21:34:13.664658 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:13 crc kubenswrapper[4962]: E1201 21:34:13.665628 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:14.165593003 +0000 UTC m=+38.267032248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.667082 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.674748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.674798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.674815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.674838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.674856 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.696735 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.697765 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wgwq\" (UniqueName: \"kubernetes.io/projected/5e1746bf-6971-44aa-ae52-f349e6963eb2-kube-api-access-9wgwq\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.719968 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.739323 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.761990 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.781743 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.783679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.783729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.783748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.783771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.783788 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.804325 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.819069 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.836785 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.854574 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.876138 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.886922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.887019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.887097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.887140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.887163 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.899372 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.920526 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.939850 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.965594 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.981915 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:13Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.990739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.990798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.990815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.990839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:13 crc kubenswrapper[4962]: I1201 21:34:13.990857 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:13Z","lastTransitionTime":"2025-12-01T21:34:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.003630 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.022674 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.040419 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.059763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.079175 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.094230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.094293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.094310 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.094336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.094352 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.103395 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.119989 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.140178 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.165643 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.169083 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:14 crc kubenswrapper[4962]: E1201 21:34:14.169284 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:14 crc kubenswrapper[4962]: E1201 21:34:14.169361 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:15.169339209 +0000 UTC m=+39.270778434 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.197196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.197291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.197311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.197367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.197387 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.199397 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d49390334652
4eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:14Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.300190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.300249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.300267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.300289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.300306 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.403915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.403995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.404007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.404025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.404037 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.507869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.507979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.508007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.508086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.508112 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.612877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.612972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.613006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.613038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.613058 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.716059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.716104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.716122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.716147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.716164 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.818660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.818716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.818734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.818756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.818775 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.922275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.922344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.922363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.922387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:14 crc kubenswrapper[4962]: I1201 21:34:14.922406 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:14Z","lastTransitionTime":"2025-12-01T21:34:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.025593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.025653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.025671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.025695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.025747 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.128266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.128300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.128309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.128323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.128334 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.181265 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:15 crc kubenswrapper[4962]: E1201 21:34:15.181517 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:15 crc kubenswrapper[4962]: E1201 21:34:15.181651 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:17.18161921 +0000 UTC m=+41.283058445 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.219468 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.219626 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:15 crc kubenswrapper[4962]: E1201 21:34:15.219833 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.220168 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.220233 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:15 crc kubenswrapper[4962]: E1201 21:34:15.220351 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:15 crc kubenswrapper[4962]: E1201 21:34:15.220547 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:15 crc kubenswrapper[4962]: E1201 21:34:15.220661 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.231747 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.231811 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.231834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.231862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.231884 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.335542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.335602 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.335622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.335645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.335662 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.439399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.439466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.439487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.439511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.439528 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.543586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.543635 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.543649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.543700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.543748 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.646521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.646577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.646593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.646619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.646636 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.749230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.749291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.749302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.749318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.749330 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.851664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.851715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.851729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.851752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.851766 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.954544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.954585 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.954596 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.954610 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:15 crc kubenswrapper[4962]: I1201 21:34:15.954620 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:15Z","lastTransitionTime":"2025-12-01T21:34:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.058729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.058803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.058820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.058847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.058874 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.163351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.163419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.163437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.163466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.163487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.242975 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.267589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.267723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.267752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.267786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.267814 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.281605 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.312161 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.338655 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.360065 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.371741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.372265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.372701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.372762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.372784 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.378767 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.400156 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.414053 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.430761 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.448865 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.463764 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.475177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.475267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.475288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.475338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.475356 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.478193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.491687 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z"
Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.506753 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.519626 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.537180 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:16Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.578654 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.578737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.578760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.578791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.578814 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.681751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.681806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.681822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.681846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.681863 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.787327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.787641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.787780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.787913 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.788102 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.891080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.892180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.892221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.892247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.892267 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.995390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.995448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.995466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.995491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:16 crc kubenswrapper[4962]: I1201 21:34:16.995510 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:16Z","lastTransitionTime":"2025-12-01T21:34:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.098981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.099047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.099112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.099145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.099173 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.202462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.202511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.202529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.202555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.202574 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.204432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.204596 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.204663 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:21.204641687 +0000 UTC m=+45.306080972 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.218582 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.218829 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.218679 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.218742 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.219331 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.219466 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.219618 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.219889 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.305893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.306285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.306448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.306593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.306767 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.409894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.410179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.410314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.410463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.410600 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.513756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.513832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.513850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.513878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.513896 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.617242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.617306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.617324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.617351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.617374 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.720436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.720548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.720567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.720591 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.720611 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.824567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.824649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.824671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.824700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.824730 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.902019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.902063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.902075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.902091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.902103 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.919253 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:17Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.924305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.924544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.924570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.924640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.924667 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.945895 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:17Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.951077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.951128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.951145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.951171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.951189 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.970598 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:17Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.975528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.975575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.975592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.975612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:17 crc kubenswrapper[4962]: I1201 21:34:17.975629 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:17Z","lastTransitionTime":"2025-12-01T21:34:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:17 crc kubenswrapper[4962]: E1201 21:34:17.994852 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:17Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.000839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.000930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
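Every retry in this stretch fails the same way: the status PATCH for node "crc" must pass the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743/node, but that webhook's serving certificate expired on 2025-08-24 while the node clock reads 2025-12-01, so the kubelet's TLS client rejects the connection before the patch is ever admitted. The wording "certificate has expired or is not yet valid" is Go's crypto/x509 validity check, which can be reproduced with a small standalone probe; the program below is an assumed diagnostic helper, not kubelet code, and only the address is taken from the log:

    // certprobe.go: minimal sketch that fetches the webhook's serving
    // certificate and reports its validity window, mirroring the
    // crypto/x509 check behind the error above.
    package main

    import (
        "crypto/tls"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Endpoint from the log; InsecureSkipVerify lets us retrieve the
        // certificate even though verification would fail.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Fprintln(os.Stderr, "dial:", err)
            os.Exit(1)
        }
        defer conn.Close()

        now := time.Now()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            // crypto/x509 reports "certificate has expired or is not yet
            // valid" exactly when now falls outside [NotBefore, NotAfter].
            valid := !now.Before(cert.NotBefore) && !now.After(cert.NotAfter)
            fmt.Printf("subject=%q notBefore=%s notAfter=%s valid=%v\n",
                cert.Subject.CommonName, cert.NotBefore, cert.NotAfter, valid)
        }
    }

On CRC this typically clears once the cluster's internal certificate rotation runs with a clock inside the validity window; until then every node status patch bounces off the expired webhook certificate.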
event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.001013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.001083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.001108 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:18 crc kubenswrapper[4962]: E1201 21:34:18.022152 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:18Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:18 crc kubenswrapper[4962]: E1201 21:34:18.022380 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.024777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
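The batch above ends with the kubelet giving up: the upstream kubelet attempts the status update a fixed number of times per cycle (tryUpdateNodeStatus is called up to nodeStatusUpdateRetry, 5 upstream) and then surfaces exactly the "update node status exceeds retry count" error recorded here, before the whole sequence starts again on the next update tick. A simplified model of that bounded retry loop, not the actual kubelet source:

    // retrymodel.go: sketch of the kubelet's bounded retry around node
    // status updates; names follow the upstream kubelet, behavior is
    // reduced to the shape visible in this log.
    package main

    import (
        "errors"
        "fmt"
    )

    const nodeStatusUpdateRetry = 5 // upstream kubelet retries at most 5 times

    // tryUpdateNodeStatus stands in for the PATCH against the API server;
    // here it always fails, like the webhook-rejected patches in the log.
    func tryUpdateNodeStatus(attempt int) error {
        return errors.New("failed to patch status: webhook certificate expired")
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdateNodeStatus(i); err != nil {
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return nil
        }
        // The condition behind the final error line above.
        return fmt.Errorf("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println("Unable to update node status:", err)
        }
    }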
event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.024825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.024843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.024865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.024882 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.127473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.127534 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.127559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.127588 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.127607 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.230541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.230602 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.230621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.230644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.230663 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.230663 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.334383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.334753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.334922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.335134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.335290 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.439238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.439773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.439926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.440133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.440259 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.543705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.543776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.543798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.543823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.543842 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.647549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.647666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.647697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.647771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.647804 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.751170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.751237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.751257 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.751282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.751304 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.854500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.854576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.854595 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.854621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.854640 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.958253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.958323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.958346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.958373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:18 crc kubenswrapper[4962]: I1201 21:34:18.958393 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:18Z","lastTransitionTime":"2025-12-01T21:34:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.061030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.061094 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.061112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.061136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.061153 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.164041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.164107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.164126 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.164149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.164167 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.218813 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:34:19 crc kubenswrapper[4962]: E1201 21:34:19.219048 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.219185 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5"
Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.219238 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:34:19 crc kubenswrapper[4962]: E1201 21:34:19.219302 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2"
Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.219402 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:34:19 crc kubenswrapper[4962]: E1201 21:34:19.219415 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 21:34:19 crc kubenswrapper[4962]: E1201 21:34:19.219537 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
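The missing network also blocks sandbox creation: the pod workers refuse to sync pods that need cluster networking while the runtime network is unready, which is what the "Error syncing pod, skipping" lines record for the console plugin, metrics daemon, and network-check pods above. Host-network pods are exempt from this gate, which is how a network operator can still start and eventually write the missing CNI config. A sketch of that gate follows; the pod list and helper names are hypothetical, and the real logic sits in the kubelet's pod workers and runtime state checks:

    // sandboxgate.go: sketch of the check behind "Error syncing pod,
    // skipping": non-host-network pods wait until NetworkReady is true.
    package main

    import (
        "errors"
        "fmt"
    )

    type pod struct {
        name        string
        hostNetwork bool
    }

    var errNetworkNotReady = errors.New("network is not ready: container runtime network not ready: NetworkReady=false")

    func canStartSandbox(p pod, networkReady bool) error {
        // Host-network pods bypass the CNI requirement entirely.
        if p.hostNetwork || networkReady {
            return nil
        }
        return errNetworkNotReady
    }

    func main() {
        pods := []pod{
            {name: "openshift-multus/network-metrics-daemon-2q5q5", hostNetwork: false},
            {name: "example-host-network-pod", hostNetwork: true}, // hypothetical
        }
        for _, p := range pods {
            if err := canStartSandbox(p, false); err != nil {
                fmt.Printf("Error syncing pod, skipping: pod=%q err=%v\n", p.name, err)
                continue
            }
            fmt.Printf("starting sandbox for pod=%q\n", p.name)
        }
    }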
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.266594 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.266638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.266654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.266676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.266694 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.370342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.370411 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.370434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.370463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.370484 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.473368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.473433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.473450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.473474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.473492 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.577232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.577283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.577300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.577324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.577344 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.680231 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.680299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.680321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.680344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.680362 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.784154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.784232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.784255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.784285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.784311 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.888092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.888166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.888183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.888208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.888226 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.990981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.991034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.991050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.991074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:19 crc kubenswrapper[4962]: I1201 21:34:19.991090 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:19Z","lastTransitionTime":"2025-12-01T21:34:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.093285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.093340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.093356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.093379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.093396 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.195769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.195830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.195847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.195870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.195888 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.303265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.303353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.303403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.303427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.303443 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.407474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.407575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.407594 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.407653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.407671 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.510352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.510423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.510447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.510480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.510500 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.613050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.613088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.613098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.613116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.613128 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.715355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.715443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.715455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.715471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.715482 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.817537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.817572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.817582 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.817597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.817612 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.921513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.921603 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.921622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.921646 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:20 crc kubenswrapper[4962]: I1201 21:34:20.921710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:20Z","lastTransitionTime":"2025-12-01T21:34:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.024491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.024569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.024589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.024610 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.024626 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.127453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.127513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.127536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.127567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.127589 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.218606 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.218658 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.218606 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.218738 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:21 crc kubenswrapper[4962]: E1201 21:34:21.218802 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:21 crc kubenswrapper[4962]: E1201 21:34:21.219063 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:21 crc kubenswrapper[4962]: E1201 21:34:21.219065 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:21 crc kubenswrapper[4962]: E1201 21:34:21.219167 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.230764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.230872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.230896 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.230923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.230973 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.249582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:21 crc kubenswrapper[4962]: E1201 21:34:21.249873 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:21 crc kubenswrapper[4962]: E1201 21:34:21.250053 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:29.250018367 +0000 UTC m=+53.351457662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.334423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.334483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.334501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.334524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.334542 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.437442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.437502 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.437520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.437545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.437563 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.541485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.541554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.541579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.541607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.541628 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.644876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.644962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.644980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.645003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.645027 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.748089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.748165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.748190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.748219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.748238 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.851262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.851332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.851354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.851385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.851407 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.954143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.954222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.954231 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.954246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:21 crc kubenswrapper[4962]: I1201 21:34:21.954257 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:21Z","lastTransitionTime":"2025-12-01T21:34:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.056757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.056801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.056813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.056830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.056842 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.159380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.159430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.159447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.159470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.159488 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.262048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.262115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.262142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.262172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.262199 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.365396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.365463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.365484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.365509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.365527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.468038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.468114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.468140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.468166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.468183 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.571385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.571433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.571445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.571462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.571473 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.673808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.673880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.673898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.673927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.673976 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.777036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.777135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.777153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.777177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.777194 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.879717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.879798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.879825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.879853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.879872 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.983170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.983230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.983247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.983272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:22 crc kubenswrapper[4962]: I1201 21:34:22.983289 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:22Z","lastTransitionTime":"2025-12-01T21:34:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.086978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.087042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.087060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.087083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.087100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.189553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.189661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.189688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.189716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.189738 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.219057 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.219062 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:23 crc kubenswrapper[4962]: E1201 21:34:23.219257 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.219364 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.219466 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:23 crc kubenswrapper[4962]: E1201 21:34:23.219546 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:23 crc kubenswrapper[4962]: E1201 21:34:23.219467 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:23 crc kubenswrapper[4962]: E1201 21:34:23.219617 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.293226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.293268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.293284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.293310 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.293328 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.396273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.396343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.396360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.396384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.396403 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.499902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.500045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.500070 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.500140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.500166 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.602709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.602775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.602792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.602816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.602832 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.705767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.705808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.705856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.705877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.705891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.808428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.808496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.808517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.808547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.808568 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.911193 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.911245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.911261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.911284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:23 crc kubenswrapper[4962]: I1201 21:34:23.911299 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:23Z","lastTransitionTime":"2025-12-01T21:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.014318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.014361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.014377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.014398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.014446 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.117862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.117971 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.118007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.118039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.118059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.221314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.221368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.221393 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.221420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.221441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.325158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.325214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.325238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.325266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.325287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.427831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.427870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.427882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.427898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.427909 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.530984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.531030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.531047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.531070 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.531086 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.633744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.633800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.633818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.633839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.633856 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.736207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.736620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.736807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.736992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.737127 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.841068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.841130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.841147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.841171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.841187 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.944668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.944739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.944755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.944778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:24 crc kubenswrapper[4962]: I1201 21:34:24.944794 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:24Z","lastTransitionTime":"2025-12-01T21:34:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.047904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.047989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.048006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.048031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.048061 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.150912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.151017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.151046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.151075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.151097 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.218804 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.218835 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.218892 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.219000 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:25 crc kubenswrapper[4962]: E1201 21:34:25.219141 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:25 crc kubenswrapper[4962]: E1201 21:34:25.219272 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:25 crc kubenswrapper[4962]: E1201 21:34:25.219416 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:25 crc kubenswrapper[4962]: E1201 21:34:25.219543 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.254547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.254633 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.254652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.254681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.254699 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.357850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.357917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.357972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.358005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.358035 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.460611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.461233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.461432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.461582 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.461708 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.565367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.565436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.565457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.565485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.565506 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.668253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.668311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.668331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.668354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.668372 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.771763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.771841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.771859 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.771884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.771902 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.875597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.875671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.875688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.875737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.875756 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.978809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.978877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.978897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.978920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:25 crc kubenswrapper[4962]: I1201 21:34:25.978979 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:25Z","lastTransitionTime":"2025-12-01T21:34:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.081831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.081879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.081895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.081930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.081979 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.184783 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.184875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.184895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.184918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.184960 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.242772 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.274278 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.288449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.288504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.288521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.288547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.288567 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.293662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.321757 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.343559 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.360085 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.379987 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.392173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.392256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.392277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.392309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.392330 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.398323 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.414910 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.437247 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.456826 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.473048 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.487218 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.495475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 
21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.495525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.495542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.495567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.495584 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.505654 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.530106 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.568228 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d49390334652
4eeef1a5aa2d85b13e46f45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:26Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.599855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.599905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.599916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.599951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.599964 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.702866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.703045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.703124 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.703206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.703234 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.807134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.807204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.807222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.807249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.807269 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.910318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.910383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.910401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.910428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:26 crc kubenswrapper[4962]: I1201 21:34:26.910445 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:26Z","lastTransitionTime":"2025-12-01T21:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.013029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.013087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.013103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.013125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.013143 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.013514 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.013740 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:34:59.013697893 +0000 UTC m=+83.115137128 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.114661 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.114736 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.114796 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.114811 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.114829 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.114889 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:59.11486969 +0000 UTC m=+83.216308925 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.114974 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115012 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115038 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115057 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115086 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:59.115057795 +0000 UTC m=+83.216497030 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115115 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:59.115098756 +0000 UTC m=+83.216537991 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115238 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115278 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115299 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.115389 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:59.115363623 +0000 UTC m=+83.216802848 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.116565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.116613 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.116632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.116654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.116671 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.219101 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.219155 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.219128 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.219261 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.219404 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.219696 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.220024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.220101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.220191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.220269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.220293 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.220817 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:27 crc kubenswrapper[4962]: E1201 21:34:27.221003 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.221109 4962 scope.go:117] "RemoveContainer" containerID="b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.323109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.323514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.323530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.323555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.323573 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.425965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.426011 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.426022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.426038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.426049 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.529049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.529092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.529100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.529115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.529124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.632724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.632775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.632796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.632860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.632884 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.669751 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/1.log" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.675246 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.676925 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.700419 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.736545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.736586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.736597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.736613 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.736624 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.738405 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.763676 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.782603 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.800665 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.822576 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.839702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.839726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.839734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.839746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.839755 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.841828 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.858149 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.878295 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753f
c478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.891951 4962 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.903876 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.937798 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.942003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.942026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.942035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.942047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.942056 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:27Z","lastTransitionTime":"2025-12-01T21:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.951056 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.963046 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.974562 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:27 crc kubenswrapper[4962]: I1201 21:34:27.993503 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:27Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.044120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.044155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.044164 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.044179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.044192 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.128148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.128207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.128225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.128250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.128268 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: E1201 21:34:28.150825 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.157331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.157379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.157397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.157421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.157442 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: E1201 21:34:28.178658 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.186080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.186145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.186168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.186202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.186233 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: E1201 21:34:28.209789 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.214713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.214766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.214784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.214809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.214826 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: E1201 21:34:28.234033 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.239866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.239917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.239927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.239961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.239977 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: E1201 21:34:28.253863 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: E1201 21:34:28.254125 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.256583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.256632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.256649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.256677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.256695 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.359358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.359414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.359428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.359452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.359468 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.463289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.463352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.463370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.463397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.463416 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.566018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.566082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.566102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.566127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.566149 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.670717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.670776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.670792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.670816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.670833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.682600 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/2.log" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.683696 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/1.log" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.690442 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592" exitCode=1 Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.690576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.690664 4962 scope.go:117] "RemoveContainer" containerID="b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.691720 4962 scope.go:117] "RemoveContainer" containerID="b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592" Dec 01 21:34:28 crc kubenswrapper[4962]: E1201 21:34:28.692053 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.719116 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.756994 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1
b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b188c66b2540fd993398d1612471d493903346524eeef1a5aa2d85b13e46f45c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:10Z\\\",\\\"message\\\":\\\"work policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:10Z is after 2025-08-24T17:21:41Z]\\\\nI1201 21:34:10.543525 6419 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}},\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 
21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.774372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.774419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.774436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.774459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.774477 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.776036 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.797470 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
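
[Annotation] The patch bodies inside these errors become readable once the log quoting is peeled off: each layer of \" is one round of string-quoting added when the error was logged. A sketch using strconv.Unquote and json.Indent on a shortened, illustrative payload (the uid is node-ca-4wzdm's from the record above; the real patches run to kilobytes):

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	// Shortened, illustrative payload copied in spirit from the record above.
	escaped := `"{\"metadata\":{\"uid\":\"8c4763ed-582e-4003-a0da-526ae8aee799\"},\"status\":{\"phase\":\"Running\"}}"`
	unquoted, err := strconv.Unquote(escaped) // peel one layer of \" quoting
	if err != nil {
		log.Fatal(err)
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(unquoted), "", "  "); err != nil {
		log.Fatal(err)
	}
	fmt.Println(pretty.String())
}
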
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.819357 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.838735 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.860167 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.879823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.879888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.879906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.879960 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.879979 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.885904 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:
34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.908760 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 
shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.929930 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.948873 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.966787 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.983621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.983686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.983703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.983732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.983751 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:28Z","lastTransitionTime":"2025-12-01T21:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:28 crc kubenswrapper[4962]: I1201 21:34:28.988736 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:28Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.008850 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.029670 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.046195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.086361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.086413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.086431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.086454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.086474 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.189376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.189448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.189471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.189504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.189528 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.219145 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.219229 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.219343 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:29 crc kubenswrapper[4962]: E1201 21:34:29.219604 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.219651 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:29 crc kubenswrapper[4962]: E1201 21:34:29.219842 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:29 crc kubenswrapper[4962]: E1201 21:34:29.220169 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:29 crc kubenswrapper[4962]: E1201 21:34:29.220275 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.293327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.293406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.293423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.293449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.293470 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.344473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:29 crc kubenswrapper[4962]: E1201 21:34:29.344763 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:29 crc kubenswrapper[4962]: E1201 21:34:29.344927 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:34:45.344891517 +0000 UTC m=+69.446330722 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.397393 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.397457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.397474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.397501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.397519 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.501891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.502006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.502025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.502058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.502081 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.606074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.606127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.606145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.606169 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.606187 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.697054 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/2.log" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.702797 4962 scope.go:117] "RemoveContainer" containerID="b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592" Dec 01 21:34:29 crc kubenswrapper[4962]: E1201 21:34:29.703155 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.709268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.709522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.709676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.709820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.710033 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.739167 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.762832 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.783624 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.802628 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.814228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.814468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.814831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.815029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.815161 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.823731 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.839236 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.863462 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.887247 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.904492 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.917766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.917824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.917841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.917866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.917883 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:29Z","lastTransitionTime":"2025-12-01T21:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.927208 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.947243 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.966414 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:29 crc kubenswrapper[4962]: I1201 21:34:29.989618 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:29Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.006123 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.021010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.021083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.021102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.021127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.021148 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.023334 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.038796 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.124334 4962 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.124388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.124423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.124447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.124464 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.226867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.226952 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.226970 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.226992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.227010 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.329752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.330239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.330263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.330295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.330319 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.433052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.433111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.433129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.433153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.433172 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.536876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.537007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.537026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.537052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.537077 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.553663 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.570025 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.574790 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.596055 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.612270 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.631074 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.639990 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.640046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.640062 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.640088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.640105 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.649897 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.686227 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1
b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.711335 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.730788 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.743752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.743831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.743855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.743887 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.743910 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.755506 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.779307 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.802315 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.819314 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.837929 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.846611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.846665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.846682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.846706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.846724 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.860090 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48
efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.882420 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d8
6add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.902822 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:30Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.949888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.949980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.949999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.950030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:30 crc kubenswrapper[4962]: I1201 21:34:30.950045 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:30Z","lastTransitionTime":"2025-12-01T21:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.053552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.053630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.053650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.053675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.053692 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.156320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.156378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.156396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.156419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.156439 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.218766 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.218838 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.218895 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:31 crc kubenswrapper[4962]: E1201 21:34:31.219085 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.219569 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:31 crc kubenswrapper[4962]: E1201 21:34:31.220066 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:31 crc kubenswrapper[4962]: E1201 21:34:31.220317 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:31 crc kubenswrapper[4962]: E1201 21:34:31.220513 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.259882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.260001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.260028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.260052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.260070 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.364143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.364209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.364227 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.364252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.364269 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.467082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.467134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.467151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.467174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.467191 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.569513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.569567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.569583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.569605 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.569621 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.672316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.672372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.672389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.672412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.672430 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.774746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.774841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.774855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.774873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.774888 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.878470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.878527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.878545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.878569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.878586 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.981697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.981759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.981777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.981801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:31 crc kubenswrapper[4962]: I1201 21:34:31.981819 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:31Z","lastTransitionTime":"2025-12-01T21:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.084710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.084767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.084784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.084809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.084827 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.187422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.187461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.187472 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.187491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.187505 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.290645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.290707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.290724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.290748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.290764 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.394630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.394687 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.394739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.394766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.394782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.498017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.498069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.498085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.498106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.498125 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.602523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.602600 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.602622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.602650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.602673 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.706419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.706490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.706510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.706536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.706555 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.811165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.811230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.811248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.811278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.811301 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.914873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.914920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.914949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.914966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:32 crc kubenswrapper[4962]: I1201 21:34:32.914978 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:32Z","lastTransitionTime":"2025-12-01T21:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.017378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.017428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.017447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.017466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.017483 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.121030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.121087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.121104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.121131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.121147 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.218860 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.218989 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:33 crc kubenswrapper[4962]: E1201 21:34:33.219107 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.219139 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.219182 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:33 crc kubenswrapper[4962]: E1201 21:34:33.219285 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:33 crc kubenswrapper[4962]: E1201 21:34:33.219478 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:33 crc kubenswrapper[4962]: E1201 21:34:33.219557 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.223665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.223727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.223748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.223778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.223805 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.326305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.326377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.326399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.326427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.326449 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.428991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.429052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.429075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.429107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.429131 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.532382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.532451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.532473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.532502 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.532523 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.635321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.635379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.635401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.635557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.635590 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.738468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.738542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.738565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.738593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.738615 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.841374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.841473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.841495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.841524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.841544 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.945241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.945306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.945325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.945349 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:33 crc kubenswrapper[4962]: I1201 21:34:33.945367 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:33Z","lastTransitionTime":"2025-12-01T21:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.048912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.049005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.049024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.049048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.049065 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.152090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.152163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.152185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.152213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.152239 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.254436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.254513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.254537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.254564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.254587 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.356926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.357092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.357114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.357139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.357156 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.460691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.460758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.460775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.460800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.460817 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.564050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.564121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.564132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.564174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.564189 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.668572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.668661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.668685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.668721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.668746 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.772014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.772081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.772097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.772121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.772137 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.875273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.875348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.875369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.875396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.875415 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.978587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.978623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.978634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.978651 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:34 crc kubenswrapper[4962]: I1201 21:34:34.978663 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:34Z","lastTransitionTime":"2025-12-01T21:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.082088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.082133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.082150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.082175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.082193 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.184420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.184484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.184501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.184526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.184548 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.218818 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.218924 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.218877 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.218826 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:35 crc kubenswrapper[4962]: E1201 21:34:35.219172 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:35 crc kubenswrapper[4962]: E1201 21:34:35.219323 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:35 crc kubenswrapper[4962]: E1201 21:34:35.219430 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:35 crc kubenswrapper[4962]: E1201 21:34:35.219552 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.287854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.287907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.287923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.287980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.287998 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.390521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.390577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.390596 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.390618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.390636 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.493975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.494043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.494063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.494092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.494111 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.597290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.597355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.597373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.597398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.597417 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.700300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.700363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.700380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.700405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.700423 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.803093 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.803165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.803182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.803206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.803222 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.906135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.906207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.906235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.906265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:35 crc kubenswrapper[4962]: I1201 21:34:35.906286 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:35Z","lastTransitionTime":"2025-12-01T21:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.010196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.010268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.010290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.010321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.010342 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.113248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.113305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.113322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.113350 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.113373 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.215247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.215303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.215318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.215335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.215346 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.235391 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.254113 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.273177 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.289133 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.304224 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.318286 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.318376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.318436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.318462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.318512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.318980 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.341583 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.371871 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1
b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.389852 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.403383 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.416475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.422710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.422742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.422750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.422766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.422777 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.443490 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.467919 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.483559 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.504226 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.525047 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.526693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.526778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.526807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.526842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.526868 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.545272 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:36Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.630293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.630893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.631293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.631463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.631612 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.734843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.734885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.734893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.734909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.734921 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.838359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.838419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.838436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.838464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.838485 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.941883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.941964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.941984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.942012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:36 crc kubenswrapper[4962]: I1201 21:34:36.942030 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:36Z","lastTransitionTime":"2025-12-01T21:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.045396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.045464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.045489 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.045517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.045539 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.148713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.148764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.148777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.148799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.148814 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.219394 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.219488 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.219593 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:37 crc kubenswrapper[4962]: E1201 21:34:37.219820 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:37 crc kubenswrapper[4962]: E1201 21:34:37.220024 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.220257 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:37 crc kubenswrapper[4962]: E1201 21:34:37.220303 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:37 crc kubenswrapper[4962]: E1201 21:34:37.220788 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.252380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.252440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.252461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.252494 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.252516 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.355730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.355806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.355825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.355851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.355869 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.458756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.458800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.458815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.458834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.458848 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.562463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.562527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.562544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.562567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.562584 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.665653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.665705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.665722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.665745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.665763 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.768733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.768789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.768805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.768830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.768849 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.872028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.872096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.872115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.872145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.872169 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.974901 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.974986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.975002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.975025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:37 crc kubenswrapper[4962]: I1201 21:34:37.975043 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:37Z","lastTransitionTime":"2025-12-01T21:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.078228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.078267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.078280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.078296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.078309 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.181185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.181234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.181248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.181265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.181277 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.276689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.276742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.276760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.276784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.276801 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: E1201 21:34:38.299777 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:38Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.304636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.304699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.304718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.304745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.304762 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: E1201 21:34:38.325722 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:38Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.330581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.330628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.330648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.330670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.330689 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: E1201 21:34:38.351634 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:38Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.356619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.356689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.356710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.356735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.356753 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: E1201 21:34:38.377721 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... second attempt; patch payload byte-identical to the one above: same $setElementOrder/conditions, allocatable, capacity, conditions, images and nodeInfo ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:38Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.383242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.383295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
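The err strings in these retries embed the entire node-status patch as backslash-escaped JSON, one escape layer per quoting level in the journal. A minimal sketch for making a copied payload readable offline (plain Python; the helper name and the shortened sample are illustrative, not from the log):

import json

def unescape_patch(s: str, max_layers: int = 4) -> dict:
    # Peel one layer of journald backslash-escaping at a time until the
    # node-status patch parses as JSON. The payloads here are ASCII, so
    # unicode_escape round-trips safely.
    for _ in range(max_layers):
        try:
            return json.loads(s)
        except json.JSONDecodeError:
            s = s.encode().decode("unicode_escape")
    raise ValueError("still not valid JSON after unescaping")

# Shortened sample; paste a full err=... payload from the log instead.
sample = r'{\"status\":{\"conditions\":[{\"type\":\"Ready\",\"status\":\"False\"}]}}'
print(json.dumps(unescape_patch(sample), indent=2))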
Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.383311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.383335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.383355 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:38 crc kubenswrapper[4962]: E1201 21:34:38.406233 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... third attempt; patch payload again byte-identical to the first ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:38Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:38 crc kubenswrapper[4962]: E1201 21:34:38.406503 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
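Every attempt fails at the same place: the webhook serving certificate for https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-01. A minimal sketch to confirm that from the node, assuming the endpoint is still listening and the third-party cryptography package is installed (a diagnostic aid, not part of the kubelet):

import socket
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the failed Post above

# Disable verification: the point is to inspect the expired certificate,
# not to trust it.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
print("subject:   ", cert.subject.rfc4514_string())
print("not before:", cert.not_valid_before)  # naive UTC datetimes
print("not after: ", cert.not_valid_after)   # expect 2025-08-24 17:21:41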
Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.409050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.409149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.409167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.409192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:38 crc kubenswrapper[4962]: I1201 21:34:38.409210 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:38Z","lastTransitionTime":"2025-12-01T21:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} [... the same five-entry cycle (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats with only the timestamps changing, at 21:34:38.512, 21:34:38.615, 21:34:38.718, 21:34:38.823, 21:34:38.925, 21:34:39.029 and 21:34:39.133 ...] Dec 01 21:34:39 crc kubenswrapper[4962]: I1201 21:34:39.219443 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:39 crc kubenswrapper[4962]: I1201 21:34:39.219588 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:39 crc kubenswrapper[4962]: I1201 21:34:39.219616 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:39 crc kubenswrapper[4962]: I1201 21:34:39.219822 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:39 crc kubenswrapper[4962]: E1201 21:34:39.219828 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:39 crc kubenswrapper[4962]: E1201 21:34:39.220093 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:39 crc kubenswrapper[4962]: E1201 21:34:39.220189 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:39 crc kubenswrapper[4962]: E1201 21:34:39.220342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
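All four pod sync failures trace back to the single Ready=False cause: nothing has written a network config under /etc/kubernetes/cni/net.d/. A minimal sketch of the equivalent check, assuming the usual libcni extensions *.conf, *.conflist and *.json (path taken from the log; run on the node):

from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # confdir named in the log

configs = []
if CNI_CONF_DIR.is_dir():
    configs = sorted(
        p for p in CNI_CONF_DIR.iterdir()
        if p.suffix in (".conf", ".conflist", ".json")
    )

if configs:
    for p in configs:
        print("CNI config present:", p)
else:
    # Mirrors the NetworkPluginNotReady condition: until the network
    # provider drops a config here, the node stays NotReady and pods
    # without host networking cannot be sandboxed.
    print(f"no CNI configuration file in {CNI_CONF_DIR}/")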
[... the same five-entry cycle continues at ~100 ms intervals, with only the timestamps changing, from 21:34:39.236 through 21:34:40.678 ...] Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.781446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.781488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.781500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.781515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.781525 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:40Z","lastTransitionTime":"2025-12-01T21:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.884525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.884565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.884572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.884587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.884597 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:40Z","lastTransitionTime":"2025-12-01T21:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.987540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.987597 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.987609 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.987626 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:40 crc kubenswrapper[4962]: I1201 21:34:40.987639 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:40Z","lastTransitionTime":"2025-12-01T21:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.090177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.090228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.090246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.090268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.090287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.192754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.192859 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.192877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.192902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.192924 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.218523 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.218554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.218630 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:41 crc kubenswrapper[4962]: E1201 21:34:41.218655 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.218729 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:41 crc kubenswrapper[4962]: E1201 21:34:41.218876 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:41 crc kubenswrapper[4962]: E1201 21:34:41.219080 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:41 crc kubenswrapper[4962]: E1201 21:34:41.219096 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.296207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.296264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.296281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.296305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.296322 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.400202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.400254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.400271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.400295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.400313 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.502879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.502958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.502973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.502997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.503006 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.605799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.605860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.605877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.605902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.605920 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.709469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.709529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.709546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.709566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.709579 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.813199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.813274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.813326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.813351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.813369 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.921403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.921581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.921887 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.921954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:41 crc kubenswrapper[4962]: I1201 21:34:41.921968 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:41Z","lastTransitionTime":"2025-12-01T21:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.024533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.024607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.024631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.024661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.024685 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.127261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.127321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.127347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.127377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.127400 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.229915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.230018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.230042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.230067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.230090 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.332876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.332953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.332973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.332996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.333012 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.435593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.435656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.435673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.435697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.435715 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.538604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.538674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.538700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.538758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.538783 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.641216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.641261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.641276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.641295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.641307 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.744271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.744328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.744347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.744371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.744388 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.848485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.848543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.848560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.848584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.848603 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.952161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.952239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.952265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.952297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:42 crc kubenswrapper[4962]: I1201 21:34:42.952318 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:42Z","lastTransitionTime":"2025-12-01T21:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.054748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.054805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.054823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.054851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.054868 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.158012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.158061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.158070 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.158087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.158096 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.218770 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.218806 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.218831 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.218798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:43 crc kubenswrapper[4962]: E1201 21:34:43.218974 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:43 crc kubenswrapper[4962]: E1201 21:34:43.219104 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:43 crc kubenswrapper[4962]: E1201 21:34:43.219486 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:43 crc kubenswrapper[4962]: E1201 21:34:43.219612 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.219685 4962 scope.go:117] "RemoveContainer" containerID="b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592" Dec 01 21:34:43 crc kubenswrapper[4962]: E1201 21:34:43.219815 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.260862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.260916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.260959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.260984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.261002 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.363484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.363561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.363579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.363602 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.363620 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.466487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.466554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.466577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.466604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.466654 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.568854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.568958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.568975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.568991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.569001 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.670892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.670975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.670994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.671016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.671037 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.773132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.773212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.773223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.773237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.773246 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.876147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.876201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.876217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.876238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.876257 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.978516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.978568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.978584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.978607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:43 crc kubenswrapper[4962]: I1201 21:34:43.978622 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:43Z","lastTransitionTime":"2025-12-01T21:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.081818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.081877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.081897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.081930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.081984 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.184665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.184746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.184770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.184800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.184823 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.287401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.287443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.287452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.287466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.287475 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.390105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.390187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.390211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.390241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.390262 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.492681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.492750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.492768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.492790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.492808 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.596600 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.596677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.596701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.596729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.596751 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.699516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.699620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.699644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.699668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.699687 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.801980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.802038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.802056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.802079 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.802100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.905131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.905217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.905244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.905274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:44 crc kubenswrapper[4962]: I1201 21:34:44.905296 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:44Z","lastTransitionTime":"2025-12-01T21:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.007843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.007911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.007929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.007997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.008015 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.111322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.111377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.111394 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.111416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.111435 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.214494 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.214553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.214571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.214594 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.214612 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.218969 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.219036 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.219088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:45 crc kubenswrapper[4962]: E1201 21:34:45.219224 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.219263 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:45 crc kubenswrapper[4962]: E1201 21:34:45.219330 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:45 crc kubenswrapper[4962]: E1201 21:34:45.219514 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:45 crc kubenswrapper[4962]: E1201 21:34:45.219610 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.318293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.318334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.318345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.318363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.318374 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.356720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:45 crc kubenswrapper[4962]: E1201 21:34:45.356998 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:45 crc kubenswrapper[4962]: E1201 21:34:45.357085 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:35:17.357055814 +0000 UTC m=+101.458495049 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.421620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.421681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.421697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.421720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.421739 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.524259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.524317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.524334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.524359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.524377 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.627525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.627609 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.627628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.627687 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.627712 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.729773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.729826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.729838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.729851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.729867 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.832180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.832211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.832221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.832253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.832263 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.938076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.938106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.938116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.938132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:45 crc kubenswrapper[4962]: I1201 21:34:45.938142 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:45Z","lastTransitionTime":"2025-12-01T21:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.041286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.041348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.041364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.041389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.041408 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.144981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.145022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.145033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.145049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.145061 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.236848 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.248830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.248883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.248900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.248922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.248960 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.253571 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.271347 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.290705 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.307285 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.318806 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.331214 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.343379 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.352020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.352076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.352096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.352120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.352138 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.355873 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.368646 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.381284 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.390152 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.402708 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.414322 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.427316 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.447008 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1
b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.454817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.455036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.455177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.455320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.455472 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.462658 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.557842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.557903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.557928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.557984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.558008 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.660969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.661058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.661082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.661110 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.661131 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.763924 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/0.log" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.763966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.763998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.764008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.764023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.764023 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerDied","Data":"b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.764032 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.764001 4962 generic.go:334] "Generic (PLEG): container finished" podID="f38b9e31-13b0-4a48-93bf-b3722ca60642" containerID="b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0" exitCode=1 Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.764348 4962 scope.go:117] "RemoveContainer" containerID="b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.780341 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.794780 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.812844 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.826712 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.841993 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.857147 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.866242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.866269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.866278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.866292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.866301 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.872573 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.888091 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.908716 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:45Z\\\",\\\"message\\\":\\\"2025-12-01T21:34:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0\\\\n2025-12-01T21:34:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0 to /host/opt/cni/bin/\\\\n2025-12-01T21:34:00Z [verbose] multus-daemon started\\\\n2025-12-01T21:34:00Z [verbose] Readiness Indicator file check\\\\n2025-12-01T21:34:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.929659 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b89
2d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.943288 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.957302 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.968836 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.968878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.968890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.968906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.968918 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:46Z","lastTransitionTime":"2025-12-01T21:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.972480 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.984861 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:46 crc kubenswrapper[4962]: I1201 21:34:46.999211 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:46Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc 
kubenswrapper[4962]: I1201 21:34:47.013828 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.027901 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.070776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.070828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.070845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.070867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.070884 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.173600 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.173712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.173803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.173882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.173930 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.219400 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.219452 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.219462 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:47 crc kubenswrapper[4962]: E1201 21:34:47.219586 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:47 crc kubenswrapper[4962]: E1201 21:34:47.219735 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:47 crc kubenswrapper[4962]: E1201 21:34:47.219831 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.219971 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:47 crc kubenswrapper[4962]: E1201 21:34:47.220058 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.276423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.276479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.276496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.276520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.276538 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.378590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.378633 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.378644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.378660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.378672 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.481573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.481634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.481650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.481676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.481694 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.585054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.585122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.585142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.585167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.585189 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.688170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.688216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.688229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.688250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.688261 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.770100 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/0.log" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.770173 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerStarted","Data":"e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.789835 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:45Z\\\",\\\"message\\\":\\\"2025-12-01T21:34:00+00:00 [cnibincopy] 
Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0\\\\n2025-12-01T21:34:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0 to /host/opt/cni/bin/\\\\n2025-12-01T21:34:00Z [verbose] multus-daemon started\\\\n2025-12-01T21:34:00Z [verbose] Readiness Indicator file check\\\\n2025-12-01T21:34:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.791850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.791910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 
21:34:47.791958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.791988 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.792010 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.812781 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.829305 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.845850 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.864000 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.879034 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.894382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.894427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.894438 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.894457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.894468 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.899746 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.915913 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.932202 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.944892 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.956791 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.972124 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.986267 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.997265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.997335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 
21:34:47.997365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.997379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.997391 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:47Z","lastTransitionTime":"2025-12-01T21:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:47 crc kubenswrapper[4962]: I1201 21:34:47.998118 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:47Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.010994 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.025777 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.049412 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1
b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.099832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.099886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.099905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.099929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.099973 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.202444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.202474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.202482 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.202493 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.202502 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.305680 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.305742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.305761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.305787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.305805 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.408409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.408466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.408486 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.408509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.408527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.506171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.506275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.506293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.506318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.506336 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: E1201 21:34:48.523682 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.528209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.528248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.528264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.528286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.528303 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: E1201 21:34:48.545504 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.550197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.550260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.550272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.550289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.550302 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: E1201 21:34:48.569490 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.573237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.573265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.573274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.573285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.573294 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: E1201 21:34:48.590527 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.595062 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.595123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.595144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.595171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.595188 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: E1201 21:34:48.612526 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:48Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:48 crc kubenswrapper[4962]: E1201 21:34:48.612736 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.614281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.614364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.614381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.614407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.614424 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.716855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.716902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.716920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.716959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.716976 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.819864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.819920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.819963 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.819986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.820005 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.820005 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.922092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.922186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.922208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.922231 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:48 crc kubenswrapper[4962]: I1201 21:34:48.922247 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:48Z","lastTransitionTime":"2025-12-01T21:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.025098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.025161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.025184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.025217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.025239 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.127988 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.128050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.128073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.128103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.128124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.218781 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.218834 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.218791 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.218973 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:34:49 crc kubenswrapper[4962]: E1201 21:34:49.219119 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2"
Dec 01 21:34:49 crc kubenswrapper[4962]: E1201 21:34:49.219275 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 21:34:49 crc kubenswrapper[4962]: E1201 21:34:49.219406 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
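The "No sandbox for pod can be found" and "Error syncing pod" entries above all trace back to the same Ready=False condition: the container runtime reports NetworkReady=false because no CNI network configuration exists yet in /etc/kubernetes/cni/net.d/. Below is a minimal Go sketch of that kind of directory probe; the directory comes straight from the log, while the extension list (.conf, .conflist, .json) follows the usual libcni convention and is an assumption, not the CRI-O implementation.

    // cnicheck.go - minimal sketch: report whether any CNI network config
    // exists in the directory the kubelet log complains about.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        confDir := "/etc/kubernetes/cni/net.d" // directory named in the log
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Printf("cannot read %s: %v\n", confDir, err)
            os.Exit(1)
        }
        var found []string
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                found = append(found, e.Name())
            }
        }
        if len(found) == 0 {
            // Matches the condition behind NetworkReady=false in the log.
            fmt.Printf("no CNI configuration file in %s\n", confDir)
            os.Exit(1)
        }
        fmt.Printf("CNI configs present: %v\n", found)
    }

Once the network operator writes a configuration into this directory, NetworkReady should flip to true and the pending sandboxes can be created.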
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.231013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.231062 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.231084 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.231157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.231235 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.334061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.334134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.334154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.334178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.334196 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.437076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.437166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.437185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.437210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.437229 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.540653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.540700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.540717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.540740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.540757 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.643072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.643168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.643197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.643232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.643258 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.746892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.747038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.747060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.747089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.747109 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.850172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.850215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.850225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.850241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.850251 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.954068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.954120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.954139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.954166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:49 crc kubenswrapper[4962]: I1201 21:34:49.954184 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:49Z","lastTransitionTime":"2025-12-01T21:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.057699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.057757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.057775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.057799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.057818 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.160177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.160234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.160245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.160262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.160278 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.263433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.263486 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.263497 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.263515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.263527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.366445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.366486 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.366498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.366514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.366527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.468835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.468891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.468910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.468959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.468989 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.571884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.571975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.571994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.572020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.572041 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.674910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.675037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.675055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.675078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.675095 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.777732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.777824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.777862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.777892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.777912 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.880963 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.881030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.881049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.881078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.881099 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.984032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.984098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.984116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.984139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:50 crc kubenswrapper[4962]: I1201 21:34:50.984159 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:50Z","lastTransitionTime":"2025-12-01T21:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.087158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.087226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.087242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.087268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.087285 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.190505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.190574 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.190593 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.190616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.190636 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.219127 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.219193 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.219150 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:51 crc kubenswrapper[4962]: E1201 21:34:51.219315 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:51 crc kubenswrapper[4962]: E1201 21:34:51.219462 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:51 crc kubenswrapper[4962]: E1201 21:34:51.219704 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.219860 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:51 crc kubenswrapper[4962]: E1201 21:34:51.220154 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.293767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.293860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.293883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.293922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.293980 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.397837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.397896 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.397915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.397978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.397999 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.501755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.501828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.501858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.501889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.501908 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.605168 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.605221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.605238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.605261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.605279 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.708162 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.708255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.708276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.708300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.708319 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.810333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.810393 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.810413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.810436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.810455 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.913387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.913459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.913483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.913512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:51 crc kubenswrapper[4962]: I1201 21:34:51.913533 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:51Z","lastTransitionTime":"2025-12-01T21:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.016516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.016553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.016562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.016579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.016587 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.119335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.119387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.119401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.119422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.119436 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.222191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.222262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.222286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.222316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.222339 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.235927 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.325286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.325348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.325369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.325393 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.325412 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.427808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.427856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.427875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.427898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
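The "SyncLoop ADD" entry shows the kubelet still accepting the kube-rbac-proxy-crio-crc pod from the API while the node stays NotReady, plausibly because that pod runs on the host network and does not need CNI. Each surrounding setters.go:603 entry embeds the Ready condition as a JSON payload; the Go sketch below decodes one of those payloads into a struct that mirrors only the fields visible in the log (not the full Kubernetes NodeCondition type) and reports why the node is not ready.

    // readycheck.go - minimal sketch: decode the condition={...} payload
    // that setters.go logs above and print the NotReady reason.
    package main

    import (
        "encoding/json"
        "fmt"
        "log"
    )

    type nodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        // Payload copied from one of the "Node became not ready" entries above.
        raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
        var c nodeCondition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            log.Fatal(err)
        }
        if c.Type == "Ready" && c.Status != "True" {
            fmt.Printf("node NotReady since %s: %s (%s)\n",
                c.LastTransitionTime, c.Reason, c.Message)
        }
    }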
Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.531181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.531242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.531260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.531284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.531300 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.634181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.634301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.634320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.634342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.634359 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.737770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.737883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.737907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.738010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.738076 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.841184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.841223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.841232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.841245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.841255 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.945072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.945126 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.945143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.945166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:52 crc kubenswrapper[4962]: I1201 21:34:52.945186 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:52Z","lastTransitionTime":"2025-12-01T21:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.049161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.049223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.049240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.049268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.049292 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.152682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.152753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.152778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.152807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.152829 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.218535 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.218571 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.218616 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.218580 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:53 crc kubenswrapper[4962]: E1201 21:34:53.218692 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:53 crc kubenswrapper[4962]: E1201 21:34:53.218879 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:53 crc kubenswrapper[4962]: E1201 21:34:53.218987 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:53 crc kubenswrapper[4962]: E1201 21:34:53.219091 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.255351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.255409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.255434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.255466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.255488 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.358648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.358705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.358722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.358744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.358759 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.461007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.461084 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.461103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.461128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.461147 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.564615 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.564672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.564690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.564714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.564731 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.668308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.668394 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.668411 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.668435 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.668453 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.771460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.771558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.771584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.771614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.771635 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.874082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.874133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.874144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.874163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.874176 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.977224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.977302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.977326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.977356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:53 crc kubenswrapper[4962]: I1201 21:34:53.977379 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:53Z","lastTransitionTime":"2025-12-01T21:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.080853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.080925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.080977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.081001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.081021 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.184343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.184399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.184433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.184461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.184481 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.287401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.287459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.287482 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.287529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.287560 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.389856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.389968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.389997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.390027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.390048 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.493528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.493594 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.493615 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.493639 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.493655 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.596175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.596224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.596240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.596262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.596278 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.700024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.700088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.700106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.700132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.700149 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.802371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.802432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.802455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.802483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.802505 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.905218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.905275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.905291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.905317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:54 crc kubenswrapper[4962]: I1201 21:34:54.905335 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:54Z","lastTransitionTime":"2025-12-01T21:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.008332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.008514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.008536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.008558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.008576 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.111432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.111488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.111506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.111530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.111550 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.215187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.215236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.215256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.215279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.215301 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.218781 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.218838 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.218802 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.218867 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:55 crc kubenswrapper[4962]: E1201 21:34:55.219018 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:55 crc kubenswrapper[4962]: E1201 21:34:55.219303 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:55 crc kubenswrapper[4962]: E1201 21:34:55.219394 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:55 crc kubenswrapper[4962]: E1201 21:34:55.219475 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.319017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.319143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.319170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.319196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.319215 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.422397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.422461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.422479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.422505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.422525 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.525360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.525414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.525437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.525465 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.525486 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.628517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.628589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.628614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.628645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.628667 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.731743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.731823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.731841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.731866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.731889 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.834400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.834465 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.834483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.834540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.834560 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.938326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.938391 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.938414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.938443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:55 crc kubenswrapper[4962]: I1201 21:34:55.938464 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:55Z","lastTransitionTime":"2025-12-01T21:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.041966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.042043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.042066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.042097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.042119 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.145791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.145863 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.145883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.145910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.145931 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.220496 4962 scope.go:117] "RemoveContainer" containerID="b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.242851 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.251716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.251776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.251794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.251818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 
crc kubenswrapper[4962]: I1201 21:34:56.251835 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.263657 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.294588 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1
b7930a64db11f43c68076592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.317238 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:45Z\\\",\\\"message\\\":\\\"2025-12-01T21:34:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0\\\\n2025-12-01T21:34:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0 to 
/host/opt/cni/bin/\\\\n2025-12-01T21:34:00Z [verbose] multus-daemon started\\\\n2025-12-01T21:34:00Z [verbose] Readiness Indicator file check\\\\n2025-12-01T21:34:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.337393 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.348643 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.354972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.355047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.355058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.355073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.355083 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.360615 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.372871 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.384083 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.398077 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.414653 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.426705 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.439239 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.448826 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.459326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.459376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.459395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.459808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.459863 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.461384 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.472439 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.481716 4962 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fd42bb4-a281-49ee-bcb8-d6c26b78048f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://329efc0484814f00ee6b557b9788f2847ede432890d1a3a33cea693006ed07cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.494022 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.562570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.562614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.562626 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.562644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.562656 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.664847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.664894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.664905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.664922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.664953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.767631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.767686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.767698 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.767719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.767735 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.804098 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/2.log" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.807411 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.807974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.827173 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d
40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.844774 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.858923 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.870722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.870774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.870786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.870805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.870817 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.877849 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fd42bb4-a281-49ee-bcb8-d6c26b78048f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://329efc0484814f00ee6b557b9788f2847ede432890d1a3a33cea693006ed07cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08a
af09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.893862 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.909880 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.922129 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.935828 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.950458 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.965880 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.973285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.973365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.973389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.973419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.973440 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:56Z","lastTransitionTime":"2025-12-01T21:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:56 crc kubenswrapper[4962]: I1201 21:34:56.984067 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:56Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.003964 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.021688 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.044041 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.064286 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.076534 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.076609 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.076631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.076665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.076689 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.088251 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:45Z\\\",\\\"message\\\":\\\"2025-12-01T21:34:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0\\\\n2025-12-01T21:34:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0 to /host/opt/cni/bin/\\\\n2025-12-01T21:34:00Z [verbose] multus-daemon started\\\\n2025-12-01T21:34:00Z [verbose] Readiness Indicator file check\\\\n2025-12-01T21:34:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.109297 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.120678 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.179535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.179638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.179666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.179699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.179725 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.218831 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.218889 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.218980 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:57 crc kubenswrapper[4962]: E1201 21:34:57.219075 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.219107 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:57 crc kubenswrapper[4962]: E1201 21:34:57.219310 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:57 crc kubenswrapper[4962]: E1201 21:34:57.219576 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:57 crc kubenswrapper[4962]: E1201 21:34:57.219674 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.283061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.283137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.283162 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.283192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.283216 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.387210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.387262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.387278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.387323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.387340 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.490691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.490747 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.490764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.490789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.490810 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.593985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.594040 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.594058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.594081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.594098 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.696554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.696612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.696629 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.696653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.696669 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.800377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.800436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.800452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.800475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.800493 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.814540 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/3.log" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.815765 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/2.log" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.819880 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" exitCode=1 Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.819924 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.820014 4962 scope.go:117] "RemoveContainer" containerID="b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.821276 4962 scope.go:117] "RemoveContainer" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" Dec 01 21:34:57 crc kubenswrapper[4962]: E1201 21:34:57.821775 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.845224 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.866386 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.892439 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:45Z\\\",\\\"message\\\":\\\"2025-12-01T21:34:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0\\\\n2025-12-01T21:34:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0 to /host/opt/cni/bin/\\\\n2025-12-01T21:34:00Z [verbose] multus-daemon started\\\\n2025-12-01T21:34:00Z [verbose] Readiness Indicator file check\\\\n2025-12-01T21:34:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.903779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.903848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.903869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.903895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.903916 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:57Z","lastTransitionTime":"2025-12-01T21:34:57Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.916645 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.933394 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.954472 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:57 crc kubenswrapper[4962]: I1201 21:34:57.977303 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.000911 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:57Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.010567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.010654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.010669 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.010696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.010718 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.018138 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.032434 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.050269 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.064403 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.076862 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.094256 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.107162 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fd42bb4-a281-49ee-bcb8-d6c26b78048f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://329efc0484814f00ee6b557b9788f2847ede432890d1a3a33cea693006ed07cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.114533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.114727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.114852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.115028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.115149 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.122299 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.143141 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ceea6c3fd351be1043a8bf07055a4d385f97a9
23ff2f08f1ddf810320d61ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b08ed34e121e3dba29c3c3bfb0a54f8496af37c1b7930a64db11f43c68076592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:28Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:28.257117 6634 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1201 21:34:28.257138 6634 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1201 21:34:28.258137 6634 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1201 21:34:28.258116 6634 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:57Z\\\",\\\"message\\\":\\\"kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-marketplace]} name:Service_openshift-marketplace/redhat-marketplace_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.140:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97b6e7b0-06ca-455e-8259-06895040cb0c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:57.069447 7035 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-marketplace]} name:Service_openshift-marketplace/redhat-marketplace_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.140:50051:]}] Rows:[] Columns:[] 
Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97b6e7b0-06ca-455e-8259-06895040cb0c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 21:34:57.068683 7035 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.156621 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.218423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.218481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.218500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.218528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.218547 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.321726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.321779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.321792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.321811 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.321822 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.426035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.426100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.426119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.426144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.426161 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.529672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.529728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.529749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.529776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.529797 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.633282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.633353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.633401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.633430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.633456 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.728797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.728979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.729010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.729051 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.729081 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: E1201 21:34:58.767490 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.777162 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.777224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.777241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.777264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.777282 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: E1201 21:34:58.804129 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.808980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.809026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.809039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.809058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.809070 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.824655 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/3.log" Dec 01 21:34:58 crc kubenswrapper[4962]: E1201 21:34:58.828171 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.828776 4962 scope.go:117] "RemoveContainer" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" Dec 01 21:34:58 crc kubenswrapper[4962]: E1201 21:34:58.828976 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.832754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.832801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.832812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.832830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.832843 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.844498 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:45Z\\\",\\\"message\\\":\\\"2025-12-01T21:34:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0\\\\n2025-12-01T21:34:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0 to 
/host/opt/cni/bin/\\\\n2025-12-01T21:34:00Z [verbose] multus-daemon started\\\\n2025-12-01T21:34:00Z [verbose] Readiness Indicator file check\\\\n2025-12-01T21:34:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: E1201 21:34:58.845642 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.850565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.850610 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.850624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.850642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.850656 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.870720 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: E1201 21:34:58.873188 4962 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329b
a568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: E1201 21:34:58.873549 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.875728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.875786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.875802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.875821 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.875835 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.889913 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.908678 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.932492 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.953647 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.978393 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:58Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.979710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.979971 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.980071 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.980648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:58 crc kubenswrapper[4962]: I1201 21:34:58.980739 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:58Z","lastTransitionTime":"2025-12-01T21:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.005800 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.024774 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.041562 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.058959 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 
21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.079021 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.086556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.086900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.087159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.087388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.087536 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.099872 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.112277 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.112520 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:03.112487843 +0000 UTC m=+147.213927078 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.118014 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fd42bb4-a281-49ee-bcb8-d6c26b78048f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://329efc0484814f00ee6b557b9788f2847ede432890d1a3a33cea693006ed07cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.137359 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.153447 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.170367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.191025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.191094 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.191112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.191144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.191163 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.194586 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:57Z\\\",\\\"message\\\":\\\"kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-marketplace]} name:Service_openshift-marketplace/redhat-marketplace_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.140:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97b6e7b0-06ca-455e-8259-06895040cb0c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:57.069447 7035 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-marketplace]} name:Service_openshift-marketplace/redhat-marketplace_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.140:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97b6e7b0-06ca-455e-8259-06895040cb0c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 21:34:57.068683 7035 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:34:59Z is after 2025-08-24T17:21:41Z" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.213841 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.213911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.214006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.214062 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:59 crc 
kubenswrapper[4962]: E1201 21:34:59.214151 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214180 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214190 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214216 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214215 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214263 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:03.214240722 +0000 UTC m=+147.315679947 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214269 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214292 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:03.214279453 +0000 UTC m=+147.315718688 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214298 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214383 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:03.214358285 +0000 UTC m=+147.315797520 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214228 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.214468 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:03.214444287 +0000 UTC m=+147.315883582 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.218665 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.218727 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.218697 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.218677 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.218871 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.219040 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.219189 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:34:59 crc kubenswrapper[4962]: E1201 21:34:59.219304 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.294633 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.294706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.294724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.294751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.294770 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.398187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.398255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.398272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.398297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.398315 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.501161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.501215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.501237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.501269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.501290 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.607274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.607339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.607362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.607392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.607413 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.710875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.710972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.710999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.711027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.711048 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.814081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.814158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.814179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.814202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.814221 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.917719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.917764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.917780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.917802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:34:59 crc kubenswrapper[4962]: I1201 21:34:59.917819 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:34:59Z","lastTransitionTime":"2025-12-01T21:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 21:35:00 crc kubenswrapper[4962]: I1201 21:35:00.020769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:00 crc kubenswrapper[4962]: I1201 21:35:00.020894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:00 crc kubenswrapper[4962]: I1201 21:35:00.020919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:00 crc kubenswrapper[4962]: I1201 21:35:00.020966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:00 crc kubenswrapper[4962]: I1201 21:35:00.020985 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:00Z","lastTransitionTime":"2025-12-01T21:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-record cycle (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats roughly every 100 ms from 21:35:00.124 through 21:35:01.156 ...]
Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.218992 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5"
Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.219037 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.219127 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:35:01 crc kubenswrapper[4962]: E1201 21:35:01.219183 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2"
Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.219225 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:35:01 crc kubenswrapper[4962]: E1201 21:35:01.219385 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 21:35:01 crc kubenswrapper[4962]: E1201 21:35:01.219517 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 21:35:01 crc kubenswrapper[4962]: E1201 21:35:01.219650 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
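[Annotation] Every error above traces back to one condition: the container runtime finds no CNI network config under /etc/kubernetes/cni/net.d/. Below is a minimal Python sketch of that check, for illustration only, not the kubelet's actual code; the directory path comes from the log, while the scanned suffixes (.conf, .conflist, .json) follow the usual libcni convention and are an assumption here.

    #!/usr/bin/env python3
    # Minimal sketch: report whether a CNI network config is present, mirroring
    # (loosely) the check behind "no CNI configuration file in /etc/kubernetes/cni/net.d/.".
    import sys
    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # path reported by the kubelet above
    CNI_EXTENSIONS = {".conf", ".conflist", ".json"}  # assumed: conventional libcni suffixes

    def find_cni_configs(conf_dir: Path) -> list[Path]:
        """Return CNI config candidates, sorted lexically as libcni loads them."""
        if not conf_dir.is_dir():
            return []
        return sorted(p for p in conf_dir.iterdir()
                      if p.is_file() and p.suffix in CNI_EXTENSIONS)

    if __name__ == "__main__":
        configs = find_cni_configs(CNI_CONF_DIR)
        if configs:
            print("CNI config present; NetworkReady should clear once the runtime reloads:")
            for p in configs:
                print(f"  {p}")
        else:
            # This is the state the log above is in: the network provider has not
            # written its config yet, so the kubelet keeps reporting NotReady.
            print(f"no CNI config in {CNI_CONF_DIR}/ - has the network provider started?")
            sys.exit(1)

Run on this node, the sketch would presumably keep exiting 1 until the cluster's network provider writes its config file into that directory.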
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.259653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.259728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.259750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.259775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.259793 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.362448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.362508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.362525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.362548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.362564 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.466420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.466517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.466537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.466592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.466612 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.570559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.570652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.570673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.570728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.570746 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.674089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.674164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.674183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.674212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.674230 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.777173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.777220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.777236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.777261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.777277 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.880135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.880217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.880233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.880250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.880290 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.982767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.982891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.982910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.982964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:01 crc kubenswrapper[4962]: I1201 21:35:01.982982 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:01Z","lastTransitionTime":"2025-12-01T21:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.086906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.087026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.087054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.087085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.087105 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.190631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.190697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.190719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.190744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.190762 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.294784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.294855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.294874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.294900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.294988 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.398562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.398642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.398665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.398694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.398716 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.501862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.502000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.502020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.502043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.502060 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.605881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.605972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.605991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.606014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.606033 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.709561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.709623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.709641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.709665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.709683 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.813082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.813162 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.813187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.813222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.813245 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.916576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.916642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.916660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.916684 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:02 crc kubenswrapper[4962]: I1201 21:35:02.916704 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:02Z","lastTransitionTime":"2025-12-01T21:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.020225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.020291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.020309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.020336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.020357 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.123470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.123537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.123558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.123589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.123618 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.219311 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:03 crc kubenswrapper[4962]: E1201 21:35:03.219643 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.220091 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:03 crc kubenswrapper[4962]: E1201 21:35:03.220257 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.220621 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:03 crc kubenswrapper[4962]: E1201 21:35:03.220758 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.221106 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:03 crc kubenswrapper[4962]: E1201 21:35:03.221283 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.227177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.227220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.227237 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.227256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.227274 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.330137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.330201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.330219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.330243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.330260 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.433821 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.433912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.433961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.433992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.434016 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.537067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.537140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.537158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.537187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.537205 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.640626 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.640695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.640708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.640723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.640738 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.743266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.743329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.743347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.743374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.743393 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.846614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.846675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.846691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.846718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.846738 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.949852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.949968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.949987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.950014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:03 crc kubenswrapper[4962]: I1201 21:35:03.950031 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:03Z","lastTransitionTime":"2025-12-01T21:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.053012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.053087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.053105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.053129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.053147 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.155838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.155905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.155927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.155989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.156013 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.258779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.258858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.258879 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.258907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.258929 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.363039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.363107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.363124 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.363149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.363170 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.466841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.466899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.466915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.466962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.466980 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.569830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.569885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.569902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.569922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.569964 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.671927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.672009 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.672026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.672048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.672066 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.775318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.775397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.775418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.775446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.775467 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.878866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.878967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.878991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.879020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.879039 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.983236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.983373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.983397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.983434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:04 crc kubenswrapper[4962]: I1201 21:35:04.983457 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:04Z","lastTransitionTime":"2025-12-01T21:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.087048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.087128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.087146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.087171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.087189 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:05Z","lastTransitionTime":"2025-12-01T21:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.219441 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.219541 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5"
Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.219535 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:35:05 crc kubenswrapper[4962]: E1201 21:35:05.219722 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 21:35:05 crc kubenswrapper[4962]: I1201 21:35:05.219751 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:35:05 crc kubenswrapper[4962]: E1201 21:35:05.219924 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2"
Dec 01 21:35:05 crc kubenswrapper[4962]: E1201 21:35:05.220094 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 21:35:05 crc kubenswrapper[4962]: E1201 21:35:05.220194 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[The NodeNotReady status cycle repeats at 21:35:05.294 and 21:35:05.397.]
[Identical NodeNotReady status cycles repeat at 21:35:05.500, 21:35:05.603, 21:35:05.707, 21:35:05.811, 21:35:05.913, 21:35:06.016, 21:35:06.120, and 21:35:06.224; only the final condition update is kept:]
Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.224362 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.241054 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.242127 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0980f7e7-e587-417c-a00b-aada45bea2ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d0b3972d5038c6a9d8368101aa0f07a39a4d7023ae9e3ea0ef061b909afb194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb0b8e00edd30c0fbc6f9b36340785b50eff3b237aeac7dc21c74fcf801c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e689e109554f82f2b6fc4bbde2e2f2a7d6ba0b5772b57c7a53827e6a5f66223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://171ba01c49ce48dc0d4b632a8de3217fc6e9eab2c512c7214c377ec51987b2d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.272325 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.308846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b2e87-9a6e-46c6-b061-1ed93bfd2322\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52ceea6c3fd351be1043a8bf07055a4d385f97a9
23ff2f08f1ddf810320d61ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:57Z\\\",\\\"message\\\":\\\"kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-marketplace]} name:Service_openshift-marketplace/redhat-marketplace_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.140:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97b6e7b0-06ca-455e-8259-06895040cb0c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 21:34:57.069447 7035 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-marketplace]} name:Service_openshift-marketplace/redhat-marketplace_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.140:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97b6e7b0-06ca-455e-8259-06895040cb0c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 21:34:57.068683 7035 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fg5ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j77n9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.326640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.326805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.326825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.326891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.326911 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.329417 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4wzdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c4763ed-582e-4003-a0da-526ae8aee799\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca28226077849a752e7f8484f9ef98d6fb065d6ba9b3c4ef05f3ef1b0554bc32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5lb4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4wzdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.351774 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dce968b-9adc-4cbe-958c-00bdd7ba6159\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d0603aa8db338a61428d0c439ab252ad7db6ee0061992381a2f2c2534ecc08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bc7177cbcbb0fe51491e055e7a8cb565836f1ffe163b6e160c3b425a5218fc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e749578c0ce8cb1dd38de90c24c5ddf738558163fe9012afcf1db088756495\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.380796 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db02e5c3ea0837c8534d6d0d89c6343ef077bc79b08bb6514fcc7fa0d71ac94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.397474 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.412913 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-m4wg5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38b9e31-13b0-4a48-93bf-b3722ca60642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T21:34:45Z\\\",\\\"message\\\":\\\"2025-12-01T21:34:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0\\\\n2025-12-01T21:34:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dd791fb7-de12-4b23-866f-9a58f88e2ac0 to /host/opt/cni/bin/\\\\n2025-12-01T21:34:00Z [verbose] multus-daemon started\\\\n2025-12-01T21:34:00Z [verbose] Readiness Indicator file check\\\\n2025-12-01T21:34:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws4mx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-m4wg5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.430304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.430370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.430384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.430401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.430273 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47902aa9-b3e5-4279-a0ee-23ec28d1c67b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131ddb58672271d21222b6c9376dc6fa0bc5ba8d9bb4c8b0db88e81d0f59984c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38ae7ac1122c7cceeaf7ef6284034b3b00f0d34060ce2a912344e521f8288ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3df9e036c0435fee1aa00f18a53bfb50334213a9284f766de1454f30a6c78f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eea885834889b1efa1a74ddb9d4c378669affd6dcf493f7d5e2c18d63ba9b5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b892d495763f16b0d9b9afd4440fa4d704de060b42b7506b28e9d990c47bfc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fd3d341f898bc53c51ad9a79a03ffd02ef0ac0e8f53e33ebc139a30fa85b7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1a2f823c4006bd3f5acb5f92c78b1020ea0992f77a76366c5bab21657bf403f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:34:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqmfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lv2jr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.430415 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.444008 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac15c86b-cd18-4110-bda1-ba116dccb445\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T21:33:54Z\\\",\\\"message\\\":\\\" 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764624829\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764624829\\\\\\\\\\\\\\\" (2025-12-01 20:33:49 +0000 UTC to 2026-12-01 20:33:49 +0000 UTC (now=2025-12-01 21:33:54.341897199 +0000 UTC))\\\\\\\"\\\\nI1201 21:33:54.342031 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 21:33:54.342089 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 21:33:54.342220 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1201 21:33:54.342244 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1201 21:33:54.341422 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 21:33:54.342706 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 21:33:54.344341 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-602538438/tls.crt::/tmp/serving-cert-602538438/tls.key\\\\\\\"\\\\nI1201 21:33:54.345794 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 21:33:54.345817 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 21:33:54.345844 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348097 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1201 21:33:54.348569 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1201 21:33:54.367723 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.456620 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.467910 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84505e9c-7b91-400d-b30b-d7d2cfe3c29b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4acbb8c835ff8f148555f849f503e7bf07afe0526bd6d0a22277e53f2eeef390\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ebad9cf8b8ff273eadbf4b7188ec1d0e0b191dd8c7d6e48efb504a6182b655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:34:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vpt7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fqtnk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 
21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.479381 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"191b6ce3-f613-4217-b224-a65ee4cfdfe7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37ed2e5c6ce1be6c710341ab02dc9ae89ebb0b309c11e78b251ffb532c31587f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rf67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b642k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.489477 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e1746bf-6971-44aa-ae52-f349e6963eb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wgwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:34:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2q5q5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.502040 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fd42bb4-a281-49ee-bcb8-d6c26b78048f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://329efc0484814f00ee6b557b9788f2847ede432890d1a3a33cea693006ed07cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c72b3e92dfa20af11102b274817b0548b41f00933ce9c4dafcc83c8f99bb75d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T21:33:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T21:33:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.518654 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://311915b276fcb2749fd7443c07b4a339131584ba32944273ed063ed55edec6c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.534017 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69adfdccde94cd891284e61c5edd6a092d14c5276ff514618fa3b44169e6347e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6c3b22dbe7ed00b2356583225c79a46bfbf7014f2ba5b6d2226f1b1f3f7b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.535194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.535308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.535333 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.535618 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.535692 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.548719 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7dpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"943bfc9d-612b-4273-9774-f1866b7af4b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:33:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T21:34:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0486c9366b81f3abfd35806c7eef00eb2a353450d850cc665c43fe3c78c9fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T21:33:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T21:33:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7dpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:06Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.639149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:06 
crc kubenswrapper[4962]: I1201 21:35:06.639230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.639253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.639283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.639308 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.742514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.742584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.742599 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.742615 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.742644 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.845529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.845589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.845605 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.845628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.845647 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.948028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.948110 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.948130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.948156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:06 crc kubenswrapper[4962]: I1201 21:35:06.948174 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:06Z","lastTransitionTime":"2025-12-01T21:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.052427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.052470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.052481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.052496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.052508 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.155439 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.155498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.155515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.155537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.155554 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.218769 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.218783 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.218887 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.218910 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:35:07 crc kubenswrapper[4962]: E1201 21:35:07.219110 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2"
Dec 01 21:35:07 crc kubenswrapper[4962]: E1201 21:35:07.219289 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 21:35:07 crc kubenswrapper[4962]: E1201 21:35:07.219509 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 21:35:07 crc kubenswrapper[4962]: E1201 21:35:07.219589 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.258612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.259346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.259363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.259386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.259402 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.367286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.367558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.367577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.367603 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.367621 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.470303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.470357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.470375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.470396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.470412 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.572872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.572990 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.573017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.573043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.573064 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.675646 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.676050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.676203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.676339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.676486 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.779676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.779732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.779750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.779776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.779797 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.882057 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.882131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.882148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.882171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.882188 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.985740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.985828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.985854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.985883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:07 crc kubenswrapper[4962]: I1201 21:35:07.985906 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:07Z","lastTransitionTime":"2025-12-01T21:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.088688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.088749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.088765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.088787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.088808 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.191558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.191625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.191647 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.191679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.191703 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.295439 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.295519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.295544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.295578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.295604 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.398626 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.398693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.398713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.398737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.398756 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.501392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.501468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.501493 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.501522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.501539 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.604769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.604833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.604854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.604877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.604894 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.709008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.709068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.709087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.709111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.709128 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.812255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.812321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.812337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.812364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.812381 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.915426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.915478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.915495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.915517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:08 crc kubenswrapper[4962]: I1201 21:35:08.915535 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:08Z","lastTransitionTime":"2025-12-01T21:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.018598 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.018662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.018679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.018703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.018721 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.101114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.101208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.101226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.101250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.101268 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.122524 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:09Z is after 2025-08-24T17:21:41Z"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.128338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.128396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.128413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.128440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.128457 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.149029 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:09Z is after 2025-08-24T17:21:41Z"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.154809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.154869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.154887 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.154911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.154962 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.176548 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:09Z is after 2025-08-24T17:21:41Z"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.182869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.183017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.183082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.183111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.183128 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.200160 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.205733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.205819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.205847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.205912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.205984 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.218707 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.218840 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.218748 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.218742 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.219016 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.219192 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.219305 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.219422 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.233030 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T21:35:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1be34435-728a-4ba5-8928-fb5e344b2f91\\\",\\\"systemUUID\\\":\\\"c1819972-ca37-4089-9661-6671772d5e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T21:35:09Z is after 2025-08-24T17:21:41Z" Dec 01 21:35:09 crc kubenswrapper[4962]: E1201 21:35:09.233201 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.234882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.234967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.234993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.235023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.235045 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.338171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.338234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.338253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.338275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.338295 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.441513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.441587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.441606 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.441629 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.441646 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.544676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.544770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.544794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.544821 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.544841 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.648181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.648248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.648273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.648302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.648324 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.751635 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.751682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.751700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.751722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.751738 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.855244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.855311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.855329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.855360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.855379 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.961832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.961907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.961929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.961991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:09 crc kubenswrapper[4962]: I1201 21:35:09.962014 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:09Z","lastTransitionTime":"2025-12-01T21:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.065277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.065342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.065362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.065388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.065409 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.168611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.168668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.168686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.168708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.168725 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.219928 4962 scope.go:117] "RemoveContainer" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" Dec 01 21:35:10 crc kubenswrapper[4962]: E1201 21:35:10.220234 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.271872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.271974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.271997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.272021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.272039 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.375298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.375356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.375373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.375395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.375412 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.478263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.478331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.478355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.478386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.478405 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.581354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.581421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.581440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.581464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.581482 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.684749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.684797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.684810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.684827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.684838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.787515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.787557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.787568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.787584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.787596 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.890918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.891032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.891053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.891078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.891197 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.994827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.994892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.994908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.994998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:10 crc kubenswrapper[4962]: I1201 21:35:10.995025 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:10Z","lastTransitionTime":"2025-12-01T21:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.097946 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.097986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.097994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.098008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.098022 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.200347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.200403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.200420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.200442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.200460 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.219077 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.219180 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:11 crc kubenswrapper[4962]: E1201 21:35:11.219235 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.219244 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.219281 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:11 crc kubenswrapper[4962]: E1201 21:35:11.219447 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:11 crc kubenswrapper[4962]: E1201 21:35:11.219602 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:11 crc kubenswrapper[4962]: E1201 21:35:11.219684 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.304806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.304882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.304907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.304963 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.304991 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.408412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.408475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.408485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.408504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.408520 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.512367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.512446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.512473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.512503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.512526 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.616113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.616196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.616215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.616245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.616266 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.719229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.719310 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.719330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.719362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.719383 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.822786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.822843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.822862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.822884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.822902 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.925612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.925764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.925784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.925862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:11 crc kubenswrapper[4962]: I1201 21:35:11.925883 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:11Z","lastTransitionTime":"2025-12-01T21:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.029767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.029832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.029849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.029872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.029891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.133060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.133114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.133128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.133148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.133164 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.236278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.236346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.236363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.236387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.236404 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.340458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.340525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.340537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.340561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.340574 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.444363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.444453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.444478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.444515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.444536 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.548091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.548155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.548173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.548203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.548222 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.650875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.650975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.651006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.651041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.651064 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.754329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.754422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.754449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.754483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.754507 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.857385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.857453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.857470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.857495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.857512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.961079 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.961143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.961160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.961186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:12 crc kubenswrapper[4962]: I1201 21:35:12.961204 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:12Z","lastTransitionTime":"2025-12-01T21:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.064503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.064551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.064567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.064590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.064611 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.167579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.167636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.167653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.167674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.167693 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.219196 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.219211 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.219348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:35:13 crc kubenswrapper[4962]: E1201 21:35:13.219508 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
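The condition={...} payload on each setters.go:603 entry is a plain JSON object, so it can be decoded mechanically rather than read by eye. A minimal sketch, using one entry from above (the long message field is abbreviated with "..." purely to keep the sample short):

```python
import json

# One "Node became not ready" entry from the log above; the message field is
# abbreviated here, which does not affect the parsing.
LINE = ('Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.167693 4962 '
        'setters.go:603] "Node became not ready" node="crc" condition='
        '{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z",'
        '"lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady",'
        '"message":"container runtime network not ready: ..."}')

# Everything after "condition=" is valid JSON and can be decoded directly.
condition = json.loads(LINE.split("condition=", 1)[1])
print(condition["type"], condition["status"], condition["reason"])
# -> Ready False KubeletNotReady
```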
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.219597 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:13 crc kubenswrapper[4962]: E1201 21:35:13.219903 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:13 crc kubenswrapper[4962]: E1201 21:35:13.220128 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:13 crc kubenswrapper[4962]: E1201 21:35:13.220310 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.270312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.270375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.270392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.270419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.270436 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.372232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.372278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.372290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.372307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.372320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.475288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.475549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.475575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.475604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.475643 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.578867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.578918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.578963 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.578988 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.579009 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.682279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.682353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.682418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.682448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.682470 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.786087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.786179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.786205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.786240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.786263 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.888616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.888683 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.888709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.888739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.888761 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.991875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.991930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.991976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.991998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:13 crc kubenswrapper[4962]: I1201 21:35:13.992016 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:13Z","lastTransitionTime":"2025-12-01T21:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.094804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.094866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.094893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.094921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.094982 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.198234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.198304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.198328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.198355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.198376 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.301539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.301588 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.301604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.301625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.301642 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.405277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.405345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.405370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.405399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.405421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.508490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.508551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.508568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.508594 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.508615 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.610795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.610855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.610871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.610895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.610912 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.715282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.715332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.715348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.715366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.715381 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.818835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.818927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.818993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.819019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.819036 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.921524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.921586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.921604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.921628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:14 crc kubenswrapper[4962]: I1201 21:35:14.921646 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:14Z","lastTransitionTime":"2025-12-01T21:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.024795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.024854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.024872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.024897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.024914 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.127153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.127203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.127257 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.127281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.127299 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
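The setters.go:603 entries above recur at a steady pace. A minimal sketch that measures that cadence from consecutive kubelet timestamps (the four timestamps below are copied from the entries above; the rest is illustrative only):

```python
from datetime import datetime

# Timestamps of consecutive setters.go:603 entries, taken from the log above.
STAMPS = ["21:35:13.270436", "21:35:13.372320", "21:35:13.475643", "21:35:13.579009"]

times = [datetime.strptime(s, "%H:%M:%S.%f") for s in STAMPS]
gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
print([f"{g:.3f}s" for g in gaps])
# -> roughly 0.102-0.103s between successive node-status updates
```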
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.219159 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.219237 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.219282 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.219299 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 21:35:15 crc kubenswrapper[4962]: E1201 21:35:15.219713 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 21:35:15 crc kubenswrapper[4962]: E1201 21:35:15.219871 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 21:35:15 crc kubenswrapper[4962]: E1201 21:35:15.220033 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 21:35:15 crc kubenswrapper[4962]: E1201 21:35:15.220225 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.230667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.230723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.230742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.230772 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.230795 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.333835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.333911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.333930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.333981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.334003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.437389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.437445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.437462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.437486 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.437504 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.540864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.540925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.540992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.541027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.541049 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.643797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.643888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.643905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.643927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.643977 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.746648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.746715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.746735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.746764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.746783 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.850025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.850118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.850139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.850164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.850209 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.953021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.953089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.953106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.953131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:15 crc kubenswrapper[4962]: I1201 21:35:15.953151 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:15Z","lastTransitionTime":"2025-12-01T21:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.056644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.056702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.056722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.056749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.056773 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.159808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.159876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.159895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.159919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.159943 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.262450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.262525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.262548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.262577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.262599 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.301718 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lv2jr" podStartSLOduration=78.301696912 podStartE2EDuration="1m18.301696912s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.301450545 +0000 UTC m=+100.402889760" watchObservedRunningTime="2025-12-01 21:35:16.301696912 +0000 UTC m=+100.403136117" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.302005 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-m4wg5" podStartSLOduration=78.30199791 podStartE2EDuration="1m18.30199791s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.269626587 +0000 UTC m=+100.371065802" watchObservedRunningTime="2025-12-01 21:35:16.30199791 +0000 UTC m=+100.403437135" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.322647 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4wzdm" podStartSLOduration=78.322619013 podStartE2EDuration="1m18.322619013s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.322465278 +0000 UTC m=+100.423904533" watchObservedRunningTime="2025-12-01 21:35:16.322619013 +0000 UTC m=+100.424058248" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.355843 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.355801536 podStartE2EDuration="1m22.355801536s" podCreationTimestamp="2025-12-01 21:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.355300073 +0000 UTC m=+100.456739338" watchObservedRunningTime="2025-12-01 21:35:16.355801536 +0000 UTC m=+100.457240751" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.368370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.368413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.368427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.368446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.368460 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.396241 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fqtnk" podStartSLOduration=78.39621886 podStartE2EDuration="1m18.39621886s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.395984394 +0000 UTC m=+100.497423599" watchObservedRunningTime="2025-12-01 21:35:16.39621886 +0000 UTC m=+100.497658075" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.414475 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.414457031 podStartE2EDuration="1m22.414457031s" podCreationTimestamp="2025-12-01 21:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.413649899 +0000 UTC m=+100.515089084" watchObservedRunningTime="2025-12-01 21:35:16.414457031 +0000 UTC m=+100.515896236" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.470464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.470538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.470562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.470592 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.470612 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.477010 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w7dpq" podStartSLOduration=78.476992317 podStartE2EDuration="1m18.476992317s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.476393941 +0000 UTC m=+100.577833136" watchObservedRunningTime="2025-12-01 21:35:16.476992317 +0000 UTC m=+100.578431502" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.491371 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podStartSLOduration=78.491354855 podStartE2EDuration="1m18.491354855s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.491012876 +0000 UTC m=+100.592452101" watchObservedRunningTime="2025-12-01 21:35:16.491354855 +0000 UTC m=+100.592794050" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.524672 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=24.524645762 podStartE2EDuration="24.524645762s" podCreationTimestamp="2025-12-01 21:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.524592761 +0000 UTC m=+100.626031956" watchObservedRunningTime="2025-12-01 21:35:16.524645762 +0000 UTC m=+100.626085027" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.548597 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=10.548581832 podStartE2EDuration="10.548581832s" podCreationTimestamp="2025-12-01 21:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.547999217 +0000 UTC m=+100.649438422" watchObservedRunningTime="2025-12-01 21:35:16.548581832 +0000 UTC m=+100.650021017" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.573513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.573560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.573572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.573590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.573602 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.607421 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.607394261 podStartE2EDuration="46.607394261s" podCreationTimestamp="2025-12-01 21:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:16.605532202 +0000 UTC m=+100.706971397" watchObservedRunningTime="2025-12-01 21:35:16.607394261 +0000 UTC m=+100.708833486" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.676708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.676757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.676770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.676787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.676800 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.780256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.780307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.780325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.780351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.780370 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.882728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.882785 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.882803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.882827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.882844 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.986558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.986621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.986638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.986662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:16 crc kubenswrapper[4962]: I1201 21:35:16.986680 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:16Z","lastTransitionTime":"2025-12-01T21:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.090147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.090548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.090702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.090850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.091022 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.193851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.193896 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.193908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.193923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.193954 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.219138 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.219153 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.219467 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:17 crc kubenswrapper[4962]: E1201 21:35:17.219574 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.219732 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:17 crc kubenswrapper[4962]: E1201 21:35:17.219828 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:17 crc kubenswrapper[4962]: E1201 21:35:17.220031 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:17 crc kubenswrapper[4962]: E1201 21:35:17.220518 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.297172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.297231 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.297249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.297273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.297292 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.400206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.400265 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.400283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.400305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.400323 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.431353 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:17 crc kubenswrapper[4962]: E1201 21:35:17.431582 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:35:17 crc kubenswrapper[4962]: E1201 21:35:17.431662 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs podName:5e1746bf-6971-44aa-ae52-f349e6963eb2 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:21.431634233 +0000 UTC m=+165.533073468 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs") pod "network-metrics-daemon-2q5q5" (UID: "5e1746bf-6971-44aa-ae52-f349e6963eb2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.503502 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.503547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.503558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.503577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.503592 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.606632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.606708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.606732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.606758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.606775 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.709770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.709901 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.709930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.710000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.710022 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.813236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.813304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.813323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.813348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.813366 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.916459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.916524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.916544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.916570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:17 crc kubenswrapper[4962]: I1201 21:35:17.916589 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:17Z","lastTransitionTime":"2025-12-01T21:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.019798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.019869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.019926 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.019995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.020018 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.124373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.124436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.124461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.124492 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.124512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.226876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.227005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.227030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.227058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.227079 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.330395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.330453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.330470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.330494 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.330512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.434072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.434129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.434155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.434185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.434206 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.536766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.536865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.536888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.536917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.537087 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.639736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.639791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.639809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.639828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.639845 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.744002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.744059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.744081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.744105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.744168 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.847244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.847299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.847318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.847342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.847361 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.949446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.949500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.949516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.949538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:18 crc kubenswrapper[4962]: I1201 21:35:18.949555 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:18Z","lastTransitionTime":"2025-12-01T21:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.052104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.052188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.052205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.052229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.052247 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:19Z","lastTransitionTime":"2025-12-01T21:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.155600 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.155666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.155690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.155719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.155740 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:19Z","lastTransitionTime":"2025-12-01T21:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.218502 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.218548 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.218514 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:19 crc kubenswrapper[4962]: E1201 21:35:19.218670 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.218819 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:19 crc kubenswrapper[4962]: E1201 21:35:19.218921 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:19 crc kubenswrapper[4962]: E1201 21:35:19.219176 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:19 crc kubenswrapper[4962]: E1201 21:35:19.219314 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.258885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.258967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.258986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.259008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.259026 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:19Z","lastTransitionTime":"2025-12-01T21:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.362690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.362744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.362760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.362786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.362805 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:19Z","lastTransitionTime":"2025-12-01T21:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.415839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.415910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.415930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.415985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.416003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T21:35:19Z","lastTransitionTime":"2025-12-01T21:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.483821 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf"] Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.484505 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.487789 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.489176 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.490774 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.491692 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.657287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.657364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.657424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.657516 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.657559 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.759535 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.759657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.759702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.759749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.759786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.759890 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: 
\"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.759897 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.761519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.770354 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.789803 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a4d47f3-ff66-4fc0-a473-611f7ea83b98-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d8lqf\" (UID: \"7a4d47f3-ff66-4fc0-a473-611f7ea83b98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.810192 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" Dec 01 21:35:19 crc kubenswrapper[4962]: I1201 21:35:19.907500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" event={"ID":"7a4d47f3-ff66-4fc0-a473-611f7ea83b98","Type":"ContainerStarted","Data":"39d18de337706b7599bf1127277f3569870e6da2cd1c828ac55b18650ba495c7"} Dec 01 21:35:20 crc kubenswrapper[4962]: I1201 21:35:20.914089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" event={"ID":"7a4d47f3-ff66-4fc0-a473-611f7ea83b98","Type":"ContainerStarted","Data":"d0ac77c124855d53619aa0620160345d751d8ab4c3627963cf3848339ed0d07a"} Dec 01 21:35:20 crc kubenswrapper[4962]: I1201 21:35:20.930520 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d8lqf" podStartSLOduration=82.930490508 podStartE2EDuration="1m22.930490508s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:20.928346511 +0000 UTC m=+105.029785746" watchObservedRunningTime="2025-12-01 21:35:20.930490508 +0000 UTC m=+105.031929743" Dec 01 21:35:21 crc kubenswrapper[4962]: I1201 21:35:21.218997 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:21 crc kubenswrapper[4962]: I1201 21:35:21.219033 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:21 crc kubenswrapper[4962]: I1201 21:35:21.218958 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:21 crc kubenswrapper[4962]: E1201 21:35:21.219220 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:21 crc kubenswrapper[4962]: I1201 21:35:21.219539 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:21 crc kubenswrapper[4962]: E1201 21:35:21.219673 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:21 crc kubenswrapper[4962]: E1201 21:35:21.219873 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:21 crc kubenswrapper[4962]: E1201 21:35:21.219927 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:23 crc kubenswrapper[4962]: I1201 21:35:23.219258 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:23 crc kubenswrapper[4962]: I1201 21:35:23.219313 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:23 crc kubenswrapper[4962]: E1201 21:35:23.219420 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:23 crc kubenswrapper[4962]: I1201 21:35:23.220098 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:23 crc kubenswrapper[4962]: E1201 21:35:23.220318 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:23 crc kubenswrapper[4962]: I1201 21:35:23.220513 4962 scope.go:117] "RemoveContainer" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" Dec 01 21:35:23 crc kubenswrapper[4962]: E1201 21:35:23.220651 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:23 crc kubenswrapper[4962]: E1201 21:35:23.220750 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:35:23 crc kubenswrapper[4962]: I1201 21:35:23.220794 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:23 crc kubenswrapper[4962]: E1201 21:35:23.220918 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:25 crc kubenswrapper[4962]: I1201 21:35:25.219439 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:25 crc kubenswrapper[4962]: I1201 21:35:25.219439 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:25 crc kubenswrapper[4962]: E1201 21:35:25.220293 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:25 crc kubenswrapper[4962]: I1201 21:35:25.219506 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:25 crc kubenswrapper[4962]: I1201 21:35:25.219467 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:25 crc kubenswrapper[4962]: E1201 21:35:25.220488 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:25 crc kubenswrapper[4962]: E1201 21:35:25.220668 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:25 crc kubenswrapper[4962]: E1201 21:35:25.220770 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:27 crc kubenswrapper[4962]: I1201 21:35:27.219571 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:27 crc kubenswrapper[4962]: I1201 21:35:27.219625 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:27 crc kubenswrapper[4962]: I1201 21:35:27.219674 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:27 crc kubenswrapper[4962]: I1201 21:35:27.219594 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:27 crc kubenswrapper[4962]: E1201 21:35:27.219786 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:27 crc kubenswrapper[4962]: E1201 21:35:27.219905 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:27 crc kubenswrapper[4962]: E1201 21:35:27.220038 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:27 crc kubenswrapper[4962]: E1201 21:35:27.220130 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:29 crc kubenswrapper[4962]: I1201 21:35:29.219184 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:29 crc kubenswrapper[4962]: I1201 21:35:29.219219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:29 crc kubenswrapper[4962]: E1201 21:35:29.219733 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:29 crc kubenswrapper[4962]: I1201 21:35:29.219301 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:29 crc kubenswrapper[4962]: E1201 21:35:29.219983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:29 crc kubenswrapper[4962]: I1201 21:35:29.219268 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:29 crc kubenswrapper[4962]: E1201 21:35:29.220131 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:29 crc kubenswrapper[4962]: E1201 21:35:29.220269 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:31 crc kubenswrapper[4962]: I1201 21:35:31.219402 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:31 crc kubenswrapper[4962]: I1201 21:35:31.219401 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:31 crc kubenswrapper[4962]: E1201 21:35:31.219582 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:31 crc kubenswrapper[4962]: E1201 21:35:31.219705 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:31 crc kubenswrapper[4962]: I1201 21:35:31.219431 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:31 crc kubenswrapper[4962]: E1201 21:35:31.219811 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:31 crc kubenswrapper[4962]: I1201 21:35:31.220132 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:31 crc kubenswrapper[4962]: E1201 21:35:31.220278 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:32 crc kubenswrapper[4962]: I1201 21:35:32.965051 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/1.log" Dec 01 21:35:32 crc kubenswrapper[4962]: I1201 21:35:32.966022 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/0.log" Dec 01 21:35:32 crc kubenswrapper[4962]: I1201 21:35:32.966094 4962 generic.go:334] "Generic (PLEG): container finished" podID="f38b9e31-13b0-4a48-93bf-b3722ca60642" containerID="e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1" exitCode=1 Dec 01 21:35:32 crc kubenswrapper[4962]: I1201 21:35:32.966153 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerDied","Data":"e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1"} Dec 01 21:35:32 crc kubenswrapper[4962]: I1201 21:35:32.966243 4962 scope.go:117] "RemoveContainer" containerID="b6b607ad59cd135df76eb0d6167a309468b46c7d487077b737d6d35390990de0" Dec 01 21:35:32 crc kubenswrapper[4962]: I1201 21:35:32.966912 4962 scope.go:117] "RemoveContainer" containerID="e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1" Dec 01 21:35:32 crc kubenswrapper[4962]: E1201 21:35:32.967254 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-m4wg5_openshift-multus(f38b9e31-13b0-4a48-93bf-b3722ca60642)\"" pod="openshift-multus/multus-m4wg5" podUID="f38b9e31-13b0-4a48-93bf-b3722ca60642" Dec 01 21:35:33 crc kubenswrapper[4962]: I1201 21:35:33.218916 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:33 crc kubenswrapper[4962]: I1201 21:35:33.218956 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:33 crc kubenswrapper[4962]: I1201 21:35:33.219052 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:33 crc kubenswrapper[4962]: I1201 21:35:33.219052 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:33 crc kubenswrapper[4962]: E1201 21:35:33.219296 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:33 crc kubenswrapper[4962]: E1201 21:35:33.219434 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:33 crc kubenswrapper[4962]: E1201 21:35:33.219602 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:33 crc kubenswrapper[4962]: E1201 21:35:33.219762 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:33 crc kubenswrapper[4962]: I1201 21:35:33.971573 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/1.log" Dec 01 21:35:35 crc kubenswrapper[4962]: I1201 21:35:35.218865 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:35 crc kubenswrapper[4962]: I1201 21:35:35.218924 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:35 crc kubenswrapper[4962]: I1201 21:35:35.219040 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:35 crc kubenswrapper[4962]: I1201 21:35:35.219089 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:35 crc kubenswrapper[4962]: E1201 21:35:35.219156 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:35 crc kubenswrapper[4962]: E1201 21:35:35.219384 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:35 crc kubenswrapper[4962]: E1201 21:35:35.219529 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:35 crc kubenswrapper[4962]: E1201 21:35:35.219673 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:36 crc kubenswrapper[4962]: E1201 21:35:36.199004 4962 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 21:35:36 crc kubenswrapper[4962]: E1201 21:35:36.335838 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:35:37 crc kubenswrapper[4962]: I1201 21:35:37.218560 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:37 crc kubenswrapper[4962]: I1201 21:35:37.218666 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:37 crc kubenswrapper[4962]: I1201 21:35:37.219167 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:37 crc kubenswrapper[4962]: I1201 21:35:37.219388 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:37 crc kubenswrapper[4962]: E1201 21:35:37.219380 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:37 crc kubenswrapper[4962]: E1201 21:35:37.219471 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:37 crc kubenswrapper[4962]: E1201 21:35:37.219710 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:37 crc kubenswrapper[4962]: E1201 21:35:37.220475 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:37 crc kubenswrapper[4962]: I1201 21:35:37.220872 4962 scope.go:117] "RemoveContainer" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" Dec 01 21:35:37 crc kubenswrapper[4962]: E1201 21:35:37.221202 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j77n9_openshift-ovn-kubernetes(017b2e87-9a6e-46c6-b061-1ed93bfd2322)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" Dec 01 21:35:39 crc kubenswrapper[4962]: I1201 21:35:39.218677 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:39 crc kubenswrapper[4962]: I1201 21:35:39.218725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:39 crc kubenswrapper[4962]: E1201 21:35:39.218864 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:39 crc kubenswrapper[4962]: I1201 21:35:39.218898 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:39 crc kubenswrapper[4962]: E1201 21:35:39.219092 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:39 crc kubenswrapper[4962]: I1201 21:35:39.219168 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:39 crc kubenswrapper[4962]: E1201 21:35:39.219283 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:39 crc kubenswrapper[4962]: E1201 21:35:39.219398 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:41 crc kubenswrapper[4962]: I1201 21:35:41.219198 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:41 crc kubenswrapper[4962]: E1201 21:35:41.220203 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:41 crc kubenswrapper[4962]: I1201 21:35:41.219225 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:41 crc kubenswrapper[4962]: I1201 21:35:41.219223 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:41 crc kubenswrapper[4962]: E1201 21:35:41.220316 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:41 crc kubenswrapper[4962]: I1201 21:35:41.219331 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:41 crc kubenswrapper[4962]: E1201 21:35:41.220489 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:41 crc kubenswrapper[4962]: E1201 21:35:41.220696 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:41 crc kubenswrapper[4962]: E1201 21:35:41.337571 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:35:43 crc kubenswrapper[4962]: I1201 21:35:43.219555 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:43 crc kubenswrapper[4962]: I1201 21:35:43.219590 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:43 crc kubenswrapper[4962]: I1201 21:35:43.219590 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:43 crc kubenswrapper[4962]: I1201 21:35:43.219652 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:43 crc kubenswrapper[4962]: E1201 21:35:43.219755 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:43 crc kubenswrapper[4962]: E1201 21:35:43.219905 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:43 crc kubenswrapper[4962]: E1201 21:35:43.220102 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:43 crc kubenswrapper[4962]: E1201 21:35:43.220192 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:45 crc kubenswrapper[4962]: I1201 21:35:45.220767 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:45 crc kubenswrapper[4962]: I1201 21:35:45.220778 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:45 crc kubenswrapper[4962]: E1201 21:35:45.221002 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:45 crc kubenswrapper[4962]: I1201 21:35:45.221109 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:45 crc kubenswrapper[4962]: I1201 21:35:45.221168 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:45 crc kubenswrapper[4962]: E1201 21:35:45.221291 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:45 crc kubenswrapper[4962]: E1201 21:35:45.221484 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:45 crc kubenswrapper[4962]: E1201 21:35:45.221633 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:46 crc kubenswrapper[4962]: E1201 21:35:46.338385 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:35:47 crc kubenswrapper[4962]: I1201 21:35:47.219370 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:47 crc kubenswrapper[4962]: I1201 21:35:47.219450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:47 crc kubenswrapper[4962]: E1201 21:35:47.219557 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:47 crc kubenswrapper[4962]: I1201 21:35:47.219399 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:47 crc kubenswrapper[4962]: E1201 21:35:47.219627 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:47 crc kubenswrapper[4962]: I1201 21:35:47.219456 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:47 crc kubenswrapper[4962]: E1201 21:35:47.219707 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:47 crc kubenswrapper[4962]: E1201 21:35:47.219894 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:48 crc kubenswrapper[4962]: I1201 21:35:48.219266 4962 scope.go:117] "RemoveContainer" containerID="e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1" Dec 01 21:35:49 crc kubenswrapper[4962]: I1201 21:35:49.041543 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/1.log" Dec 01 21:35:49 crc kubenswrapper[4962]: I1201 21:35:49.042061 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerStarted","Data":"1b8d562c177ec53feb71127f293276762385b527ac37171b4992a030c29c6db7"} Dec 01 21:35:49 crc kubenswrapper[4962]: I1201 21:35:49.219603 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:49 crc kubenswrapper[4962]: I1201 21:35:49.219653 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:49 crc kubenswrapper[4962]: I1201 21:35:49.219686 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:49 crc kubenswrapper[4962]: I1201 21:35:49.219756 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:49 crc kubenswrapper[4962]: E1201 21:35:49.221250 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:49 crc kubenswrapper[4962]: E1201 21:35:49.220992 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:49 crc kubenswrapper[4962]: E1201 21:35:49.221369 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:49 crc kubenswrapper[4962]: E1201 21:35:49.220992 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:50 crc kubenswrapper[4962]: I1201 21:35:50.220321 4962 scope.go:117] "RemoveContainer" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.050254 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/3.log" Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.053722 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerStarted","Data":"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581"} Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.087418 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podStartSLOduration=113.087386354 podStartE2EDuration="1m53.087386354s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:35:51.086893961 +0000 UTC m=+135.188333226" watchObservedRunningTime="2025-12-01 21:35:51.087386354 +0000 UTC m=+135.188825649" Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.121386 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2q5q5"] Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.121597 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:51 crc kubenswrapper[4962]: E1201 21:35:51.121785 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.219452 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.219454 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:51 crc kubenswrapper[4962]: E1201 21:35:51.219868 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:51 crc kubenswrapper[4962]: E1201 21:35:51.219971 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:51 crc kubenswrapper[4962]: I1201 21:35:51.219490 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:51 crc kubenswrapper[4962]: E1201 21:35:51.220055 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:51 crc kubenswrapper[4962]: E1201 21:35:51.339774 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:35:52 crc kubenswrapper[4962]: I1201 21:35:52.219094 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:52 crc kubenswrapper[4962]: E1201 21:35:52.219330 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:53 crc kubenswrapper[4962]: I1201 21:35:53.219085 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:53 crc kubenswrapper[4962]: I1201 21:35:53.219136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:53 crc kubenswrapper[4962]: I1201 21:35:53.219110 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:53 crc kubenswrapper[4962]: E1201 21:35:53.219264 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:53 crc kubenswrapper[4962]: E1201 21:35:53.219403 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:53 crc kubenswrapper[4962]: E1201 21:35:53.219515 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:54 crc kubenswrapper[4962]: I1201 21:35:54.219643 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:54 crc kubenswrapper[4962]: E1201 21:35:54.219870 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:55 crc kubenswrapper[4962]: I1201 21:35:55.219370 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:55 crc kubenswrapper[4962]: I1201 21:35:55.219460 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:55 crc kubenswrapper[4962]: I1201 21:35:55.219482 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:55 crc kubenswrapper[4962]: E1201 21:35:55.219561 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 21:35:55 crc kubenswrapper[4962]: E1201 21:35:55.219681 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 21:35:55 crc kubenswrapper[4962]: E1201 21:35:55.220053 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 21:35:56 crc kubenswrapper[4962]: I1201 21:35:56.218717 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:56 crc kubenswrapper[4962]: E1201 21:35:56.220878 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2q5q5" podUID="5e1746bf-6971-44aa-ae52-f349e6963eb2" Dec 01 21:35:57 crc kubenswrapper[4962]: I1201 21:35:57.219219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:35:57 crc kubenswrapper[4962]: I1201 21:35:57.219237 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:35:57 crc kubenswrapper[4962]: I1201 21:35:57.219245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:35:57 crc kubenswrapper[4962]: I1201 21:35:57.226160 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 21:35:57 crc kubenswrapper[4962]: I1201 21:35:57.227214 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 21:35:57 crc kubenswrapper[4962]: I1201 21:35:57.228495 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 21:35:57 crc kubenswrapper[4962]: I1201 21:35:57.228758 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 21:35:58 crc kubenswrapper[4962]: I1201 21:35:58.221170 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:35:58 crc kubenswrapper[4962]: I1201 21:35:58.224121 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 21:35:58 crc kubenswrapper[4962]: I1201 21:35:58.224121 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.668826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.720344 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.721071 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.722023 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h69ht"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.722406 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.723310 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hvvd4"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.723826 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.725374 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.725762 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.727469 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qj8zv"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.727791 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.729035 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.730000 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.731373 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.731707 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.731894 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732205 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732336 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732419 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732495 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732598 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732695 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732834 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" 
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.732967 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.733060 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.733157 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.733586 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.733697 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.743994 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7x948"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.744280 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w2wdh"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.744586 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lt2df"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.744689 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.744866 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.745100 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.788600 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.788920 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.791350 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793008 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793151 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793342 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793421 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793483 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793499 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793600 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793680 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793815 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793904 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793840 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63788da5-7737-4b23-aef1-283bc26a4202-config\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794003 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.793872 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794027 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p28qt\" (UniqueName: \"kubernetes.io/projected/02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d-kube-api-access-p28qt\") pod \"dns-operator-744455d44c-h69ht\" (UID: \"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69ht"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794060 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g42xl\" (UniqueName: \"kubernetes.io/projected/c25f2bd8-5e89-40b2-8c62-3c67a364384f-kube-api-access-g42xl\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794098 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-ca\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794121 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794158 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794167 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-client\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794181 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/570052d5-a3db-4720-b8ab-32b4d71c44f8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w86m4\" (UID: \"570052d5-a3db-4720-b8ab-32b4d71c44f8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794228 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm796\" (UniqueName: \"kubernetes.io/projected/63788da5-7737-4b23-aef1-283bc26a4202-kube-api-access-pm796\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794256 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-config\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794304 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63788da5-7737-4b23-aef1-283bc26a4202-auth-proxy-config\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794335 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-service-ca\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794346 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794379 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794404 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d-metrics-tls\") pod \"dns-operator-744455d44c-h69ht\" (UID: \"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69ht"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxj56\" (UniqueName: \"kubernetes.io/projected/570052d5-a3db-4720-b8ab-32b4d71c44f8-kube-api-access-gxj56\") pod \"cluster-samples-operator-665b6dd947-w86m4\" (UID: \"570052d5-a3db-4720-b8ab-32b4d71c44f8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794443 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c25f2bd8-5e89-40b2-8c62-3c67a364384f-serving-cert\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794443 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794483 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/63788da5-7737-4b23-aef1-283bc26a4202-machine-approver-tls\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794577 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794586 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794672 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794815 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.794950 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.795100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.795576 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.797293 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.797475 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.797699 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.798301 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-csp6p"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.798600 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.799086 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.799411 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.799674 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.801389 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.802019 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.802064 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.802325 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.802452 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.802465 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.802645 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.803013 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.804019 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jsv9r"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.804329 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.804664 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.805537 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cjns7"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.805951 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.806571 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.806855 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jsv9r"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.810065 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.810274 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.811863 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.812308 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.813212 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.813741 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.814076 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.814450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.814723 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.814811 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.814926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.816299 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cjns7"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.819386 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.819694 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.819735 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829199 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829336 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829499 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829567 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829736 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829802 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829924 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.829957 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830030 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830140 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830158 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830200 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830268 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830289 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830382 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830424 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830533 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830626 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j5fqx"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830636 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830663 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830672 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.842681 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.843098 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830707 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830753 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830766 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830789 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.830825 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.840620 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.840842 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.841205 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.844538 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.844848 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.845457 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.861360 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.861892 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.865271 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.865345 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hjqsp"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.865687 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k8fq2"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.866219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.866535 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.866767 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hjqsp"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.866858 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.867270 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.869372 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.871054 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.871258 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.871862 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.873165 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.873750 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.874083 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.874261 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.874312 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.874382 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gg284"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.874714 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.874801 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gg284"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.875533 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.878405 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.878877 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.880029 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.882338 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.882837 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.882987 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.883422 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.884670 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.886381 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.890090 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.890324 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.890373 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.891472 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.891860 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.892108 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stv9m"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.892529 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-85jfd"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.893089 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5kqx"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.893912 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.893988 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.894070 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.894143 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-85jfd"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-service-ca\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfzc9\" (UniqueName: \"kubernetes.io/projected/847a70f9-ec15-480b-8ed8-9d3cb006ce64-kube-api-access-kfzc9\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/931bdb70-51c0-4893-b3b9-6fe8dd700233-audit-dir\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895508 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qw78\" (UniqueName: \"kubernetes.io/projected/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-kube-api-access-9qw78\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895551 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6zs5\" (UniqueName: \"kubernetes.io/projected/0b40c111-e562-48b7-9db2-1a494e16786c-kube-api-access-l6zs5\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895572 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895590 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a56487ec-ba83-42dc-b8b4-35c353fbc836-srv-cert\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895613 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b57c377-c954-4559-af66-e64599cf5f71-config\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895628 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4"]
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895633 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm796\" (UniqueName: \"kubernetes.io/projected/63788da5-7737-4b23-aef1-283bc26a4202-kube-api-access-pm796\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895688 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-config\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895707 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895742 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-serving-cert\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-oauth-config\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895799 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3bc6803-77e3-4e63-bfaa-761500e161de-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895839 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63788da5-7737-4b23-aef1-283bc26a4202-auth-proxy-config\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895860 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-service-ca\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8bsz\" (UniqueName: \"kubernetes.io/projected/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-kube-api-access-z8bsz\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895904 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a56487ec-ba83-42dc-b8b4-35c353fbc836-profile-collector-cert\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895950 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895971 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/847a70f9-ec15-480b-8ed8-9d3cb006ce64-audit-dir\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.895993 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-etcd-client\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896014 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-encryption-config\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-client-ca\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64k8k\" (UniqueName: \"kubernetes.io/projected/06b421f6-7c09-4105-9aa3-06296e88a57f-kube-api-access-64k8k\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896077 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hvck\" (UniqueName: \"kubernetes.io/projected/a56487ec-ba83-42dc-b8b4-35c353fbc836-kube-api-access-2hvck\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3bc6803-77e3-4e63-bfaa-761500e161de-trusted-ca\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896131 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-serving-cert\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d-metrics-tls\") pod \"dns-operator-744455d44c-h69ht\" (UID: \"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69ht"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896178 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxj56\" (UniqueName: \"kubernetes.io/projected/570052d5-a3db-4720-b8ab-32b4d71c44f8-kube-api-access-gxj56\") pod \"cluster-samples-operator-665b6dd947-w86m4\" (UID: \"570052d5-a3db-4720-b8ab-32b4d71c44f8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896199 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896218 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-serving-cert\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c25f2bd8-5e89-40b2-8c62-3c67a364384f-serving-cert\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896281 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f31ef6d-c116-4335-bc5d-5357a379d202-serving-cert\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896302 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b57c377-c954-4559-af66-e64599cf5f71-trusted-ca\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-audit-policies\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896357 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-config\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27zvm\" (UniqueName: \"kubernetes.io/projected/a3bc6803-77e3-4e63-bfaa-761500e161de-kube-api-access-27zvm\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896431 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04aab66b-a04f-459e-a760-eec00bec0115-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896452 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhpg\" (UniqueName: \"kubernetes.io/projected/931bdb70-51c0-4893-b3b9-6fe8dd700233-kube-api-access-zvhpg\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896475 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/12136e64-010e-49bc-9c3e-d1c65467f361-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-dir\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896519 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:00 crc
kubenswrapper[4962]: I1201 21:36:00.896555 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/696189ea-f7d5-445e-a07e-ebb32f5e219c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-config\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896621 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-config\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896641 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b57c377-c954-4559-af66-e64599cf5f71-serving-cert\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896665 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696189ea-f7d5-445e-a07e-ebb32f5e219c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896687 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5frm\" (UniqueName: \"kubernetes.io/projected/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-kube-api-access-r5frm\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896733 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-etcd-client\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0b40c111-e562-48b7-9db2-1a494e16786c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896783 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06b421f6-7c09-4105-9aa3-06296e88a57f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6ff\" (UniqueName: \"kubernetes.io/projected/7b57c377-c954-4559-af66-e64599cf5f71-kube-api-access-ww6ff\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-policies\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gl2\" (UniqueName: 
\"kubernetes.io/projected/12136e64-010e-49bc-9c3e-d1c65467f361-kube-api-access-z8gl2\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896926 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/63788da5-7737-4b23-aef1-283bc26a4202-machine-approver-tls\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896966 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.896992 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6fdb\" (UniqueName: \"kubernetes.io/projected/00f6ed0c-f791-460d-acd4-d100a0b21710-kube-api-access-b6fdb\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl5jh\" (UID: \"00f6ed0c-f791-460d-acd4-d100a0b21710\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897018 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897040 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-image-import-ca\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897063 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04aab66b-a04f-459e-a760-eec00bec0115-config\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897108 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/63788da5-7737-4b23-aef1-283bc26a4202-config\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897129 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-trusted-ca-bundle\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00f6ed0c-f791-460d-acd4-d100a0b21710-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl5jh\" (UID: \"00f6ed0c-f791-460d-acd4-d100a0b21710\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897177 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b40c111-e562-48b7-9db2-1a494e16786c-serving-cert\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897200 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkdx\" (UniqueName: \"kubernetes.io/projected/cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1-kube-api-access-ztkdx\") pod \"multus-admission-controller-857f4d67dd-j5fqx\" (UID: \"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7499g\" (UniqueName: \"kubernetes.io/projected/2fb73eca-8b7b-488b-bea1-3ca7f1145fd7-kube-api-access-7499g\") pod \"migrator-59844c95c7-lwmpj\" (UID: \"2fb73eca-8b7b-488b-bea1-3ca7f1145fd7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-oauth-serving-cert\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897263 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-config\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897317 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p28qt\" (UniqueName: \"kubernetes.io/projected/02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d-kube-api-access-p28qt\") pod \"dns-operator-744455d44c-h69ht\" (UID: \"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897341 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-serving-cert\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12136e64-010e-49bc-9c3e-d1c65467f361-config\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b421f6-7c09-4105-9aa3-06296e88a57f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g42xl\" (UniqueName: \"kubernetes.io/projected/c25f2bd8-5e89-40b2-8c62-3c67a364384f-kube-api-access-g42xl\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897422 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63788da5-7737-4b23-aef1-283bc26a4202-auth-proxy-config\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkfbl\" (UniqueName: \"kubernetes.io/projected/9f31ef6d-c116-4335-bc5d-5357a379d202-kube-api-access-vkfbl\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897510 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-ca\") pod 
\"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.897556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j5fqx\" (UID: \"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.898417 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.898445 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h69ht"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.898975 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-service-ca\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899043 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899138 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrhts\" (UniqueName: \"kubernetes.io/projected/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-kube-api-access-lrhts\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899171 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/696189ea-f7d5-445e-a07e-ebb32f5e219c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:00 crc kubenswrapper[4962]: 
I1201 21:36:00.899232 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-encryption-config\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899254 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a3bc6803-77e3-4e63-bfaa-761500e161de-metrics-tls\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899280 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04aab66b-a04f-459e-a760-eec00bec0115-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899303 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/847a70f9-ec15-480b-8ed8-9d3cb006ce64-node-pullsecrets\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-audit\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-client\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/570052d5-a3db-4720-b8ab-32b4d71c44f8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w86m4\" (UID: \"570052d5-a3db-4720-b8ab-32b4d71c44f8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgq8g\" (UniqueName: \"kubernetes.io/projected/4002c0b0-4b79-4755-a60c-2fc0cdac7876-kube-api-access-zgq8g\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-service-ca-bundle\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899436 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-config\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899456 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-etcd-serving-ca\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12136e64-010e-49bc-9c3e-d1c65467f361-images\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.899747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-config\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.900807 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63788da5-7737-4b23-aef1-283bc26a4202-config\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.901564 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-ca\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.911441 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-csp6p"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.912564 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.913716 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7x948"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.928489 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/63788da5-7737-4b23-aef1-283bc26a4202-machine-approver-tls\") pod 
\"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.929088 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c25f2bd8-5e89-40b2-8c62-3c67a364384f-serving-cert\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.929089 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d-metrics-tls\") pod \"dns-operator-744455d44c-h69ht\" (UID: \"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.929389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c25f2bd8-5e89-40b2-8c62-3c67a364384f-etcd-client\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.929768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/570052d5-a3db-4720-b8ab-32b4d71c44f8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w86m4\" (UID: \"570052d5-a3db-4720-b8ab-32b4d71c44f8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.930611 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.931695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.933008 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.937411 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qj8zv"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.937462 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.937478 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zmd8d"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.944963 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.945340 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.945451 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.946066 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gg284"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.947585 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.948784 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k8fq2"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.949633 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lt2df"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.950420 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.951427 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w2wdh"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.952997 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j5fqx"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.953398 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.954383 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.955567 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jsv9r"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.958425 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.974762 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hvvd4"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.974810 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.990000 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.991379 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.992890 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.993175 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5"] Dec 01 21:36:00 crc kubenswrapper[4962]: I1201 21:36:00.994543 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 21:36:00 crc 
kubenswrapper[4962]: I1201 21:36:00.998183 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.000113 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5kqx"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.000204 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cjns7"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003362 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-config\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003397 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/931bdb70-51c0-4893-b3b9-6fe8dd700233-audit-dir\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a56487ec-ba83-42dc-b8b4-35c353fbc836-srv-cert\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003464 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgkfc\" (UniqueName: \"kubernetes.io/projected/7ad9fda2-065b-4620-bc36-e33403fcdd53-kube-api-access-rgkfc\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003486 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b57c377-c954-4559-af66-e64599cf5f71-config\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 
21:36:01.003521 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-serving-cert\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003537 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81201e01-7100-49fe-a7d2-d402fa8fe9af-proxy-tls\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003569 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3bc6803-77e3-4e63-bfaa-761500e161de-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003595 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81201e01-7100-49fe-a7d2-d402fa8fe9af-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003625 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a56487ec-ba83-42dc-b8b4-35c353fbc836-profile-collector-cert\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003642 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/979f8807-55d0-475c-9cd6-08b47d99f9e2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003675 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003689 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-client-ca\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64k8k\" (UniqueName: \"kubernetes.io/projected/06b421f6-7c09-4105-9aa3-06296e88a57f-kube-api-access-64k8k\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wgxq\" (UniqueName: \"kubernetes.io/projected/ccb13006-cd7f-4249-9d04-7391d26eaae3-kube-api-access-6wgxq\") pod \"package-server-manager-789f6589d5-7p82b\" (UID: \"ccb13006-cd7f-4249-9d04-7391d26eaae3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003775 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-serving-cert\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f31ef6d-c116-4335-bc5d-5357a379d202-serving-cert\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003828 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b57c377-c954-4559-af66-e64599cf5f71-trusted-ca\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-audit-policies\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27zvm\" (UniqueName: \"kubernetes.io/projected/a3bc6803-77e3-4e63-bfaa-761500e161de-kube-api-access-27zvm\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f4a93f57-1e83-4b94-b1eb-fc28eea15503-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003901 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04aab66b-a04f-459e-a760-eec00bec0115-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhpg\" (UniqueName: \"kubernetes.io/projected/931bdb70-51c0-4893-b3b9-6fe8dd700233-kube-api-access-zvhpg\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95b5b235-1db6-458c-bdf9-c065eb0c70e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10c473f1-a2f6-4565-ba09-d7e28dea1600-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.003986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.004002 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95b5b235-1db6-458c-bdf9-c065eb0c70e5-webhook-cert\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.004075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-config\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.004117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r77mm\" (UniqueName: \"kubernetes.io/projected/f4a93f57-1e83-4b94-b1eb-fc28eea15503-kube-api-access-r77mm\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.004144 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lccq\" (UniqueName: \"kubernetes.io/projected/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-kube-api-access-8lccq\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.004175 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.004203 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06b421f6-7c09-4105-9aa3-06296e88a57f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.005146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-client-ca\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.006128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.006998 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007060 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007073 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007139 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-policies\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/979f8807-55d0-475c-9cd6-08b47d99f9e2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007203 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6fdb\" (UniqueName: \"kubernetes.io/projected/00f6ed0c-f791-460d-acd4-d100a0b21710-kube-api-access-b6fdb\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl5jh\" (UID: \"00f6ed0c-f791-460d-acd4-d100a0b21710\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007236 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04aab66b-a04f-459e-a760-eec00bec0115-config\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007352 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-node-bootstrap-token\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " 
pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007373 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkdx\" (UniqueName: \"kubernetes.io/projected/cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1-kube-api-access-ztkdx\") pod \"multus-admission-controller-857f4d67dd-j5fqx\" (UID: \"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c08b9259-9d06-4450-a2e6-09740b95fdea-serving-cert\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-oauth-serving-cert\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007426 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007442 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-serving-cert\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007459 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12136e64-010e-49bc-9c3e-d1c65467f361-config\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkfbl\" (UniqueName: \"kubernetes.io/projected/9f31ef6d-c116-4335-bc5d-5357a379d202-kube-api-access-vkfbl\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007509 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/979f8807-55d0-475c-9cd6-08b47d99f9e2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007526 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j5fqx\" (UID: \"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrhts\" (UniqueName: \"kubernetes.io/projected/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-kube-api-access-lrhts\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007580 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/696189ea-f7d5-445e-a07e-ebb32f5e219c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-certs\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007637 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04aab66b-a04f-459e-a760-eec00bec0115-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/847a70f9-ec15-480b-8ed8-9d3cb006ce64-node-pullsecrets\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007777 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zgq8g\" (UniqueName: \"kubernetes.io/projected/4002c0b0-4b79-4755-a60c-2fc0cdac7876-kube-api-access-zgq8g\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-service-ca-bundle\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-etcd-serving-ca\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12136e64-010e-49bc-9c3e-d1c65467f361-images\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007844 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-service-ca\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007860 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007878 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfzc9\" (UniqueName: \"kubernetes.io/projected/847a70f9-ec15-480b-8ed8-9d3cb006ce64-kube-api-access-kfzc9\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztp9d\" (UniqueName: \"kubernetes.io/projected/81201e01-7100-49fe-a7d2-d402fa8fe9af-kube-api-access-ztp9d\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-stats-auth\") pod \"router-default-5444994796-hjqsp\" 
(UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qw78\" (UniqueName: \"kubernetes.io/projected/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-kube-api-access-9qw78\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007977 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.007993 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6zs5\" (UniqueName: \"kubernetes.io/projected/0b40c111-e562-48b7-9db2-1a494e16786c-kube-api-access-l6zs5\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008048 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr64r\" (UniqueName: \"kubernetes.io/projected/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-kube-api-access-wr64r\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008109 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c473f1-a2f6-4565-ba09-d7e28dea1600-proxy-tls\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008128 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-oauth-config\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/931bdb70-51c0-4893-b3b9-6fe8dd700233-audit-dir\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.008301 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stv9m"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.010771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.011477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/847a70f9-ec15-480b-8ed8-9d3cb006ce64-node-pullsecrets\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.012055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-service-ca-bundle\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.014606 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-etcd-serving-ca\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " 
pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.015280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12136e64-010e-49bc-9c3e-d1c65467f361-images\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.015358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.015617 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.016295 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-policies\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.016878 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-service-ca\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017134 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-oauth-serving-cert\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017220 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-config\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-audit-policies\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017518 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjb4\" (UniqueName: \"kubernetes.io/projected/f10f3763-03b0-43d0-88fd-ce89274a67d9-kube-api-access-zqjb4\") pod 
\"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017543 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/931bdb70-51c0-4893-b3b9-6fe8dd700233-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-service-ca-bundle\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8bsz\" (UniqueName: \"kubernetes.io/projected/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-kube-api-access-z8bsz\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017701 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/847a70f9-ec15-480b-8ed8-9d3cb006ce64-audit-dir\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017737 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-etcd-client\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017785 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-encryption-config\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hvck\" (UniqueName: \"kubernetes.io/projected/a56487ec-ba83-42dc-b8b4-35c353fbc836-kube-api-access-2hvck\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/95b5b235-1db6-458c-bdf9-c065eb0c70e5-tmpfs\") pod 
\"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017853 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qpv\" (UniqueName: \"kubernetes.io/projected/979f8807-55d0-475c-9cd6-08b47d99f9e2-kube-api-access-n4qpv\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-metrics-certs\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017897 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3bc6803-77e3-4e63-bfaa-761500e161de-trusted-ca\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.017911 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/847a70f9-ec15-480b-8ed8-9d3cb006ce64-audit-dir\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018244 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-serving-cert\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018287 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018323 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c08b9259-9d06-4450-a2e6-09740b95fdea-config\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018374 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12136e64-010e-49bc-9c3e-d1c65467f361-config\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-config\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018576 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/12136e64-010e-49bc-9c3e-d1c65467f361-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-dir\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018649 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/696189ea-f7d5-445e-a07e-ebb32f5e219c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81201e01-7100-49fe-a7d2-d402fa8fe9af-images\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018771 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-config\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r"
\"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018828 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696189ea-f7d5-445e-a07e-ebb32f5e219c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.018846 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5frm\" (UniqueName: \"kubernetes.io/projected/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-kube-api-access-r5frm\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f31ef6d-c116-4335-bc5d-5357a379d202-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019079 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-dir\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019266 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019309 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-etcd-client\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019443 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-config\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019489 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/696189ea-f7d5-445e-a07e-ebb32f5e219c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:01 crc kubenswrapper[4962]: 
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0b40c111-e562-48b7-9db2-1a494e16786c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019653 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019764 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019791 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-config\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.019037 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3bc6803-77e3-4e63-bfaa-761500e161de-trusted-ca\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0b40c111-e562-48b7-9db2-1a494e16786c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020165 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6ff\" (UniqueName: \"kubernetes.io/projected/7b57c377-c954-4559-af66-e64599cf5f71-kube-api-access-ww6ff\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gl2\" (UniqueName: \"kubernetes.io/projected/12136e64-010e-49bc-9c3e-d1c65467f361-kube-api-access-z8gl2\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv"
\"kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-image-import-ca\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-trusted-ca-bundle\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00f6ed0c-f791-460d-acd4-d100a0b21710-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl5jh\" (UID: \"00f6ed0c-f791-460d-acd4-d100a0b21710\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020449 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b40c111-e562-48b7-9db2-1a494e16786c-serving-cert\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn44l\" (UniqueName: \"kubernetes.io/projected/95b5b235-1db6-458c-bdf9-c065eb0c70e5-kube-api-access-fn44l\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020490 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ccb13006-cd7f-4249-9d04-7391d26eaae3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7p82b\" (UID: \"ccb13006-cd7f-4249-9d04-7391d26eaae3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020513 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7499g\" (UniqueName: \"kubernetes.io/projected/2fb73eca-8b7b-488b-bea1-3ca7f1145fd7-kube-api-access-7499g\") pod \"migrator-59844c95c7-lwmpj\" (UID: \"2fb73eca-8b7b-488b-bea1-3ca7f1145fd7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020534 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-config\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020567 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b421f6-7c09-4105-9aa3-06296e88a57f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-secret-volume\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020603 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvqfg\" (UniqueName: \"kubernetes.io/projected/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-kube-api-access-vvqfg\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020620 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f4a93f57-1e83-4b94-b1eb-fc28eea15503-srv-cert\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020675 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wz6n\" (UniqueName: 
\"kubernetes.io/projected/10c473f1-a2f6-4565-ba09-d7e28dea1600-kube-api-access-9wz6n\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020693 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwr8\" (UniqueName: \"kubernetes.io/projected/c08b9259-9d06-4450-a2e6-09740b95fdea-kube-api-access-dfwr8\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020708 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-default-certificate\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020727 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-encryption-config\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a3bc6803-77e3-4e63-bfaa-761500e161de-metrics-tls\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020757 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-audit\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.020775 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z54pr\" (UniqueName: \"kubernetes.io/projected/710d5a74-c24a-452e-a5bf-1c23b3589361-kube-api-access-z54pr\") pod \"downloads-7954f5f757-gg284\" (UID: \"710d5a74-c24a-452e-a5bf-1c23b3589361\") " pod="openshift-console/downloads-7954f5f757-gg284" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.021142 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-trusted-ca-bundle\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.021578 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06b421f6-7c09-4105-9aa3-06296e88a57f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.021659 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-image-import-ca\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.021987 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-config\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.022698 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.022895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/847a70f9-ec15-480b-8ed8-9d3cb006ce64-audit\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.022706 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f31ef6d-c116-4335-bc5d-5357a379d202-serving-cert\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.024208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06b421f6-7c09-4105-9aa3-06296e88a57f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.025001 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.025006 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-encryption-config\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.025510 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-serving-cert\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " 
pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.025428 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-serving-cert\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.025793 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-config\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.025855 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.025918 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/696189ea-f7d5-445e-a07e-ebb32f5e219c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.026582 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-85jfd"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.026946 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.027090 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.027232 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.027358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.027905 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-n5kbb"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.028185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-oauth-config\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.028484 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.028901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.029237 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n9nhh"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.029289 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-etcd-client\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.029348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.029587 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-serving-cert\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.029768 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.030054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.030411 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n5kbb"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.031550 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n9nhh"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.032731 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tqsxt"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.033522 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.034189 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tqsxt"] Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.037505 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-encryption-config\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.039323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/931bdb70-51c0-4893-b3b9-6fe8dd700233-serving-cert\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.041492 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/847a70f9-ec15-480b-8ed8-9d3cb006ce64-etcd-client\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.042646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/12136e64-010e-49bc-9c3e-d1c65467f361-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.042714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.042962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b40c111-e562-48b7-9db2-1a494e16786c-serving-cert\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.044245 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.051000 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.062995 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.074615 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a3bc6803-77e3-4e63-bfaa-761500e161de-metrics-tls\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.083093 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.103089 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95b5b235-1db6-458c-bdf9-c065eb0c70e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10c473f1-a2f6-4565-ba09-d7e28dea1600-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95b5b235-1db6-458c-bdf9-c065eb0c70e5-webhook-cert\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122332 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r77mm\" (UniqueName: \"kubernetes.io/projected/f4a93f57-1e83-4b94-b1eb-fc28eea15503-kube-api-access-r77mm\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc 
kubenswrapper[4962]: I1201 21:36:01.122365 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lccq\" (UniqueName: \"kubernetes.io/projected/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-kube-api-access-8lccq\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122403 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/979f8807-55d0-475c-9cd6-08b47d99f9e2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122459 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-node-bootstrap-token\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122499 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c08b9259-9d06-4450-a2e6-09740b95fdea-serving-cert\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122553 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/979f8807-55d0-475c-9cd6-08b47d99f9e2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122609 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-certs\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122739 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztp9d\" (UniqueName: \"kubernetes.io/projected/81201e01-7100-49fe-a7d2-d402fa8fe9af-kube-api-access-ztp9d\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122774 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-stats-auth\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122861 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr64r\" (UniqueName: \"kubernetes.io/projected/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-kube-api-access-wr64r\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122905 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c473f1-a2f6-4565-ba09-d7e28dea1600-proxy-tls\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.122965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjb4\" (UniqueName: \"kubernetes.io/projected/f10f3763-03b0-43d0-88fd-ce89274a67d9-kube-api-access-zqjb4\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-service-ca-bundle\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123054 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/95b5b235-1db6-458c-bdf9-c065eb0c70e5-tmpfs\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qpv\" (UniqueName: \"kubernetes.io/projected/979f8807-55d0-475c-9cd6-08b47d99f9e2-kube-api-access-n4qpv\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123109 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10c473f1-a2f6-4565-ba09-d7e28dea1600-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123121 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-metrics-certs\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123151 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123169 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c08b9259-9d06-4450-a2e6-09740b95fdea-config\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81201e01-7100-49fe-a7d2-d402fa8fe9af-images\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123318 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123414 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn44l\" (UniqueName: \"kubernetes.io/projected/95b5b235-1db6-458c-bdf9-c065eb0c70e5-kube-api-access-fn44l\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123493 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccb13006-cd7f-4249-9d04-7391d26eaae3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7p82b\" (UID: \"ccb13006-cd7f-4249-9d04-7391d26eaae3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123610 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-secret-volume\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/95b5b235-1db6-458c-bdf9-c065eb0c70e5-tmpfs\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123697 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvqfg\" (UniqueName: \"kubernetes.io/projected/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-kube-api-access-vvqfg\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f4a93f57-1e83-4b94-b1eb-fc28eea15503-srv-cert\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wz6n\" (UniqueName: \"kubernetes.io/projected/10c473f1-a2f6-4565-ba09-d7e28dea1600-kube-api-access-9wz6n\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.123971 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwr8\" (UniqueName: \"kubernetes.io/projected/c08b9259-9d06-4450-a2e6-09740b95fdea-kube-api-access-dfwr8\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-default-certificate\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z54pr\" (UniqueName: \"kubernetes.io/projected/710d5a74-c24a-452e-a5bf-1c23b3589361-kube-api-access-z54pr\") pod \"downloads-7954f5f757-gg284\" (UID: \"710d5a74-c24a-452e-a5bf-1c23b3589361\") " pod="openshift-console/downloads-7954f5f757-gg284" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgkfc\" (UniqueName: \"kubernetes.io/projected/7ad9fda2-065b-4620-bc36-e33403fcdd53-kube-api-access-rgkfc\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81201e01-7100-49fe-a7d2-d402fa8fe9af-proxy-tls\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124253 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81201e01-7100-49fe-a7d2-d402fa8fe9af-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124288 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124329 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/979f8807-55d0-475c-9cd6-08b47d99f9e2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124420 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wgxq\" (UniqueName: \"kubernetes.io/projected/ccb13006-cd7f-4249-9d04-7391d26eaae3-kube-api-access-6wgxq\") pod \"package-server-manager-789f6589d5-7p82b\" (UID: \"ccb13006-cd7f-4249-9d04-7391d26eaae3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.124502 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f4a93f57-1e83-4b94-b1eb-fc28eea15503-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.125196 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81201e01-7100-49fe-a7d2-d402fa8fe9af-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.143676 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.153950 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.162576 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.182588 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.184769 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.202663 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.222622 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.242675 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.263085 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.272613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b57c377-c954-4559-af66-e64599cf5f71-serving-cert\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.282601 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.302368 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.313264 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b57c377-c954-4559-af66-e64599cf5f71-config\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.329971 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.331219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b57c377-c954-4559-af66-e64599cf5f71-trusted-ca\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.344039 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.355987 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00f6ed0c-f791-460d-acd4-d100a0b21710-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl5jh\" (UID: \"00f6ed0c-f791-460d-acd4-d100a0b21710\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.362706 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.383774 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.390210 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a56487ec-ba83-42dc-b8b4-35c353fbc836-srv-cert\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.403965 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.422700 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.443792 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.449828 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f4a93f57-1e83-4b94-b1eb-fc28eea15503-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.451208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-secret-volume\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.455588 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a56487ec-ba83-42dc-b8b4-35c353fbc836-profile-collector-cert\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.463128 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.483137 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.489801 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04aab66b-a04f-459e-a760-eec00bec0115-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.502827 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.522234 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.530253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04aab66b-a04f-459e-a760-eec00bec0115-config\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.543337 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.563630 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.592436 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.597663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/979f8807-55d0-475c-9cd6-08b47d99f9e2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.605037 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.623983 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.644203 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.663927 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.674538 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-service-ca-bundle\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.683690 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.693615 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j5fqx\" (UID: \"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.704537 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.724053 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.729354 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-default-certificate\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.744060 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.757922 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-stats-auth\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.763429 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.777279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-metrics-certs\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.783805 4962 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.804876 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.843923 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.849172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f4a93f57-1e83-4b94-b1eb-fc28eea15503-srv-cert\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.863776 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.878434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccb13006-cd7f-4249-9d04-7391d26eaae3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7p82b\" (UID: \"ccb13006-cd7f-4249-9d04-7391d26eaae3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.881856 4962 request.go:700] Waited for 1.006875602s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-chnjx&limit=500&resourceVersion=0 Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.883690 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.904074 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.915480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81201e01-7100-49fe-a7d2-d402fa8fe9af-images\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.924077 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.944529 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.959073 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81201e01-7100-49fe-a7d2-d402fa8fe9af-proxy-tls\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.963902 4962 reflector.go:368] 
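The request.go:700 message above is client-go's client-side rate limiter reporting that a LIST had to queue for about a second before being sent. The knob it refers to lives on the client's rest.Config; a sketch on a generic config follows (the kubeconfig path and values are illustrative, and the kubelet sets its own limits):

    package main

    import (
        "fmt"

        "k8s.io/client-go/rest"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }

        // Raising QPS/Burst widens the client-side token bucket and shortens
        // or removes "Waited for ... due to client-side throttling" delays.
        cfg.QPS = 50    // client-go's default is 5 requests/second
        cfg.Burst = 100 // client-go's default is 10

        tuned := rest.CopyConfig(cfg) // hand this to kubernetes.NewForConfig(...)
        fmt.Printf("tuned client config: QPS=%v Burst=%v\n", tuned.QPS, tuned.Burst)
    }
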
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.977355 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95b5b235-1db6-458c-bdf9-c065eb0c70e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.979879 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95b5b235-1db6-458c-bdf9-c065eb0c70e5-webhook-cert\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7"
Dec 01 21:36:01 crc kubenswrapper[4962]: I1201 21:36:01.984290 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.004385 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.018451 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c473f1-a2f6-4565-ba09-d7e28dea1600-proxy-tls\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.023698 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.032854 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.043424 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.052357 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.056033 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c08b9259-9d06-4450-a2e6-09740b95fdea-serving-cert\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.063539 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.084033 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.105128 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.118911 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/979f8807-55d0-475c-9cd6-08b47d99f9e2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg"
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.122699 4962 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.122818 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-node-bootstrap-token podName:5ae96720-62a1-4f8e-b9b1-c234fdf318e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.622780784 +0000 UTC m=+146.724220049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-node-bootstrap-token") pod "machine-config-server-zmd8d" (UID: "5ae96720-62a1-4f8e-b9b1-c234fdf318e8") : failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.122839 4962 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.122867 4962 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.122891 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-certs podName:5ae96720-62a1-4f8e-b9b1-c234fdf318e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.622874456 +0000 UTC m=+146.724313661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-certs") pod "machine-config-server-zmd8d" (UID: "5ae96720-62a1-4f8e-b9b1-c234fdf318e8") : failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.122958 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles podName:7ad9fda2-065b-4620-bc36-e33403fcdd53 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.622916768 +0000 UTC m=+146.724356073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles") pod "controller-manager-879f6c89f-p5kqx" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53") : failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.122993 4962 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.123045 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics podName:f10f3763-03b0-43d0-88fd-ce89274a67d9 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.623032071 +0000 UTC m=+146.724471456 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics") pod "marketplace-operator-79b997595-stv9m" (UID: "f10f3763-03b0-43d0-88fd-ce89274a67d9") : failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.123092 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.123699 4962 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.123769 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca podName:f10f3763-03b0-43d0-88fd-ce89274a67d9 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.623742289 +0000 UTC m=+146.725181564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca") pod "marketplace-operator-79b997595-stv9m" (UID: "f10f3763-03b0-43d0-88fd-ce89274a67d9") : failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.123784 4962 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.123905 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c08b9259-9d06-4450-a2e6-09740b95fdea-config podName:c08b9259-9d06-4450-a2e6-09740b95fdea nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.623869173 +0000 UTC m=+146.725308438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c08b9259-9d06-4450-a2e6-09740b95fdea-config") pod "service-ca-operator-777779d784-zbnc9" (UID: "c08b9259-9d06-4450-a2e6-09740b95fdea") : failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.124164 4962 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.124235 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert podName:7ad9fda2-065b-4620-bc36-e33403fcdd53 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.624218872 +0000 UTC m=+146.725658107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert") pod "controller-manager-879f6c89f-p5kqx" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53") : failed to sync secret cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.124948 4962 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.124983 4962 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.125009 4962 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.125052 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca podName:7ad9fda2-065b-4620-bc36-e33403fcdd53 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.624999782 +0000 UTC m=+146.726439067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca") pod "controller-manager-879f6c89f-p5kqx" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53") : failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.125081 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config podName:7ad9fda2-065b-4620-bc36-e33403fcdd53 nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.625068084 +0000 UTC m=+146.726507399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config") pod "controller-manager-879f6c89f-p5kqx" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53") : failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: E1201 21:36:02.125100 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume podName:fde3e2c2-ed59-4cbf-8554-1a0438eb81dc nodeName:}" failed. No retries permitted until 2025-12-01 21:36:02.625091415 +0000 UTC m=+146.726530750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume") pod "collect-profiles-29410410-kr2lr" (UID: "fde3e2c2-ed59-4cbf-8554-1a0438eb81dc") : failed to sync configmap cache: timed out waiting for the condition
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.143802 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.162612 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.183287 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.203092 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.223929 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.242521 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.263681 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.283786 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.303454 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.323706 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.353251 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.363504 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.384003 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.403845 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.423776 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.454283 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.464318 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.483743 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.504312 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.524036 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.570485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm796\" (UniqueName: \"kubernetes.io/projected/63788da5-7737-4b23-aef1-283bc26a4202-kube-api-access-pm796\") pod \"machine-approver-56656f9798-9wbzf\" (UID: \"63788da5-7737-4b23-aef1-283bc26a4202\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.590712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxj56\" (UniqueName: \"kubernetes.io/projected/570052d5-a3db-4720-b8ab-32b4d71c44f8-kube-api-access-gxj56\") pod \"cluster-samples-operator-665b6dd947-w86m4\" (UID: \"570052d5-a3db-4720-b8ab-32b4d71c44f8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.599453 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.617588 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p28qt\" (UniqueName: \"kubernetes.io/projected/02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d-kube-api-access-p28qt\") pod \"dns-operator-744455d44c-h69ht\" (UID: \"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.624314 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.630363 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g42xl\" (UniqueName: \"kubernetes.io/projected/c25f2bd8-5e89-40b2-8c62-3c67a364384f-kube-api-access-g42xl\") pod \"etcd-operator-b45778765-hvvd4\" (UID: \"c25f2bd8-5e89-40b2-8c62-3c67a364384f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.645372 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.652074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.652173 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.652466 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-node-bootstrap-token\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.652600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-certs\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.652657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.652809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.653042 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c08b9259-9d06-4450-a2e6-09740b95fdea-config\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.653222 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.653465 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.653561 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.654663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.654768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.655115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c08b9259-9d06-4450-a2e6-09740b95fdea-config\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.656194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.656383 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.657038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.657689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-node-bootstrap-token\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.657776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.660798 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.664372 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.667646 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.668915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-certs\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.735021 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64k8k\" (UniqueName: \"kubernetes.io/projected/06b421f6-7c09-4105-9aa3-06296e88a57f-kube-api-access-64k8k\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9pvs\" (UID: \"06b421f6-7c09-4105-9aa3-06296e88a57f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.749003 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27zvm\" (UniqueName: \"kubernetes.io/projected/a3bc6803-77e3-4e63-bfaa-761500e161de-kube-api-access-27zvm\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.766469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhpg\" (UniqueName: \"kubernetes.io/projected/931bdb70-51c0-4893-b3b9-6fe8dd700233-kube-api-access-zvhpg\") pod \"apiserver-7bbb656c7d-w56dm\" (UID: \"931bdb70-51c0-4893-b3b9-6fe8dd700233\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.786050 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.786112 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.786710 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qw78\" (UniqueName: \"kubernetes.io/projected/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-kube-api-access-9qw78\") pod \"oauth-openshift-558db77b4-csp6p\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:02 crc kubenswrapper[4962]: 
I1201 21:36:02.797360 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.802484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04aab66b-a04f-459e-a760-eec00bec0115-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-g44h6\" (UID: \"04aab66b-a04f-459e-a760-eec00bec0115\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.818039 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.824574 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgq8g\" (UniqueName: \"kubernetes.io/projected/4002c0b0-4b79-4755-a60c-2fc0cdac7876-kube-api-access-zgq8g\") pod \"console-f9d7485db-jsv9r\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") " pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.837658 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6zs5\" (UniqueName: \"kubernetes.io/projected/0b40c111-e562-48b7-9db2-1a494e16786c-kube-api-access-l6zs5\") pod \"openshift-config-operator-7777fb866f-lt2df\" (UID: \"0b40c111-e562-48b7-9db2-1a494e16786c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.837834 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.848979 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4"] Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.867817 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkdx\" (UniqueName: \"kubernetes.io/projected/cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1-kube-api-access-ztkdx\") pod \"multus-admission-controller-857f4d67dd-j5fqx\" (UID: \"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.878503 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3bc6803-77e3-4e63-bfaa-761500e161de-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sl474\" (UID: \"a3bc6803-77e3-4e63-bfaa-761500e161de\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.882308 4962 request.go:700] Waited for 1.865769015s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/control-plane-machine-set-operator/token Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.897136 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.902698 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6fdb\" (UniqueName: \"kubernetes.io/projected/00f6ed0c-f791-460d-acd4-d100a0b21710-kube-api-access-b6fdb\") pod \"control-plane-machine-set-operator-78cbb6b69f-nl5jh\" (UID: \"00f6ed0c-f791-460d-acd4-d100a0b21710\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.935252 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.935269 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.963564 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrhts\" (UniqueName: \"kubernetes.io/projected/6b92760e-ca08-4b9c-b2a0-3f522ba6a975-kube-api-access-lrhts\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkgk9\" (UID: \"6b92760e-ca08-4b9c-b2a0-3f522ba6a975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.968086 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfzc9\" (UniqueName: \"kubernetes.io/projected/847a70f9-ec15-480b-8ed8-9d3cb006ce64-kube-api-access-kfzc9\") pod \"apiserver-76f77b778f-w2wdh\" (UID: \"847a70f9-ec15-480b-8ed8-9d3cb006ce64\") " pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.971912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkfbl\" (UniqueName: \"kubernetes.io/projected/9f31ef6d-c116-4335-bc5d-5357a379d202-kube-api-access-vkfbl\") pod \"authentication-operator-69f744f599-7x948\" (UID: \"9f31ef6d-c116-4335-bc5d-5357a379d202\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.979841 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8bsz\" (UniqueName: \"kubernetes.io/projected/4fd77fbe-c9db-4f41-9c98-a8f3490c1b30-kube-api-access-z8bsz\") pod \"openshift-apiserver-operator-796bbdcf4f-cj2tt\" (UID: \"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.981810 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" Dec 01 21:36:02 crc kubenswrapper[4962]: I1201 21:36:02.997523 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hvck\" (UniqueName: \"kubernetes.io/projected/a56487ec-ba83-42dc-b8b4-35c353fbc836-kube-api-access-2hvck\") pod \"catalog-operator-68c6474976-h7flt\" (UID: \"a56487ec-ba83-42dc-b8b4-35c353fbc836\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.001429 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.022972 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696189ea-f7d5-445e-a07e-ebb32f5e219c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rvt4l\" (UID: \"696189ea-f7d5-445e-a07e-ebb32f5e219c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.042126 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.046816 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5frm\" (UniqueName: \"kubernetes.io/projected/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-kube-api-access-r5frm\") pod \"route-controller-manager-6576b87f9c-7hn2f\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.052898 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.053441 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.060444 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.061354 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6ff\" (UniqueName: \"kubernetes.io/projected/7b57c377-c954-4559-af66-e64599cf5f71-kube-api-access-ww6ff\") pod \"console-operator-58897d9998-cjns7\" (UID: \"7b57c377-c954-4559-af66-e64599cf5f71\") " pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.084434 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.089727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gl2\" (UniqueName: \"kubernetes.io/projected/12136e64-010e-49bc-9c3e-d1c65467f361-kube-api-access-z8gl2\") pod \"machine-api-operator-5694c8668f-qj8zv\" (UID: \"12136e64-010e-49bc-9c3e-d1c65467f361\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.102644 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7499g\" (UniqueName: \"kubernetes.io/projected/2fb73eca-8b7b-488b-bea1-3ca7f1145fd7-kube-api-access-7499g\") pod \"migrator-59844c95c7-lwmpj\" (UID: \"2fb73eca-8b7b-488b-bea1-3ca7f1145fd7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.113995 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" event={"ID":"570052d5-a3db-4720-b8ab-32b4d71c44f8","Type":"ContainerStarted","Data":"5fa9457c4c8e3b808be3583b26efcf5ed4b30cf1c26b57107f4f6224a6f448ef"} Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.115389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" event={"ID":"63788da5-7737-4b23-aef1-283bc26a4202","Type":"ContainerStarted","Data":"e308227fdeace7554a3eca986479107a4a0379f70b5d71aa604a5a99549c7b1a"} Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.115412 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" event={"ID":"63788da5-7737-4b23-aef1-283bc26a4202","Type":"ContainerStarted","Data":"7d316d9bad66425932bce5e12f0e304a2494cd2a60ca34b41430db2c28259593"} Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.119534 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1e2012-a9ba-488e-b877-4b0ee5f079b4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-frc7j\" (UID: \"fa1e2012-a9ba-488e-b877-4b0ee5f079b4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.122836 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.125581 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.132216 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.143714 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.144601 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.154458 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.160791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:03 crc kubenswrapper[4962]: E1201 21:36:03.161861 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:38:05.161838761 +0000 UTC m=+269.263277956 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.163428 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.163538 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.169087 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.175417 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.182991 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.183236 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.204384 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.224134 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.244594 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.263207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.263477 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.263523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.263627 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.264374 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.266762 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.269376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.276530 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.276646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.278207 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-csp6p"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.291419 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.291573 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 21:36:03 crc kubenswrapper[4962]: W1201 21:36:03.299392 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0f577b_3526_4f2e_846f_65d5a5ee1d8e.slice/crio-9f060d82be232e97b5ecfd17fd7925a7feae037e2fac54fcc0f681d1afee65a7 WatchSource:0}: Error finding container 9f060d82be232e97b5ecfd17fd7925a7feae037e2fac54fcc0f681d1afee65a7: Status 404 returned error can't find the container with id 9f060d82be232e97b5ecfd17fd7925a7feae037e2fac54fcc0f681d1afee65a7 Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.307534 4962 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.313653 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jsv9r"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.326552 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.344728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r77mm\" (UniqueName: \"kubernetes.io/projected/f4a93f57-1e83-4b94-b1eb-fc28eea15503-kube-api-access-r77mm\") pod \"olm-operator-6b444d44fb-ljnp5\" (UID: \"f4a93f57-1e83-4b94-b1eb-fc28eea15503\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.357444 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lccq\" (UniqueName: \"kubernetes.io/projected/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-kube-api-access-8lccq\") pod \"collect-profiles-29410410-kr2lr\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:03 crc kubenswrapper[4962]: W1201 21:36:03.363940 4962 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4002c0b0_4b79_4755_a60c_2fc0cdac7876.slice/crio-21538d0a3c59fee3e4d2509fe44592ed0e6dfec886c727b0424564727581376b WatchSource:0}: Error finding container 21538d0a3c59fee3e4d2509fe44592ed0e6dfec886c727b0424564727581376b: Status 404 returned error can't find the container with id 21538d0a3c59fee3e4d2509fe44592ed0e6dfec886c727b0424564727581376b Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.377989 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/979f8807-55d0-475c-9cd6-08b47d99f9e2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.408799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztp9d\" (UniqueName: \"kubernetes.io/projected/81201e01-7100-49fe-a7d2-d402fa8fe9af-kube-api-access-ztp9d\") pod \"machine-config-operator-74547568cd-7qvr5\" (UID: \"81201e01-7100-49fe-a7d2-d402fa8fe9af\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.426403 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr64r\" (UniqueName: \"kubernetes.io/projected/5ae96720-62a1-4f8e-b9b1-c234fdf318e8-kube-api-access-wr64r\") pod \"machine-config-server-zmd8d\" (UID: \"5ae96720-62a1-4f8e-b9b1-c234fdf318e8\") " pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.442355 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjb4\" (UniqueName: \"kubernetes.io/projected/f10f3763-03b0-43d0-88fd-ce89274a67d9-kube-api-access-zqjb4\") pod \"marketplace-operator-79b997595-stv9m\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.459881 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qpv\" (UniqueName: \"kubernetes.io/projected/979f8807-55d0-475c-9cd6-08b47d99f9e2-kube-api-access-n4qpv\") pod \"cluster-image-registry-operator-dc59b4c8b-kg8bg\" (UID: \"979f8807-55d0-475c-9cd6-08b47d99f9e2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.490666 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn44l\" (UniqueName: \"kubernetes.io/projected/95b5b235-1db6-458c-bdf9-c065eb0c70e5-kube-api-access-fn44l\") pod \"packageserver-d55dfcdfc-6ffb7\" (UID: \"95b5b235-1db6-458c-bdf9-c065eb0c70e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.508291 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.508646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvqfg\" (UniqueName: \"kubernetes.io/projected/3bc8d3cc-b827-4f76-b41e-18e0790b6e66-kube-api-access-vvqfg\") pod \"router-default-5444994796-hjqsp\" (UID: \"3bc8d3cc-b827-4f76-b41e-18e0790b6e66\") " pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.519830 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wz6n\" (UniqueName: \"kubernetes.io/projected/10c473f1-a2f6-4565-ba09-d7e28dea1600-kube-api-access-9wz6n\") pod \"machine-config-controller-84d6567774-gh8cn\" (UID: \"10c473f1-a2f6-4565-ba09-d7e28dea1600\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.522399 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j5fqx"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.524571 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.526871 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.531699 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.534080 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h69ht"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.537250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hvvd4"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.537913 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.543142 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwr8\" (UniqueName: \"kubernetes.io/projected/c08b9259-9d06-4450-a2e6-09740b95fdea-kube-api-access-dfwr8\") pod \"service-ca-operator-777779d784-zbnc9\" (UID: \"c08b9259-9d06-4450-a2e6-09740b95fdea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.545575 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.545692 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.551197 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.558707 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.562325 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.562965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z54pr\" (UniqueName: \"kubernetes.io/projected/710d5a74-c24a-452e-a5bf-1c23b3589361-kube-api-access-z54pr\") pod \"downloads-7954f5f757-gg284\" (UID: \"710d5a74-c24a-452e-a5bf-1c23b3589361\") " pod="openshift-console/downloads-7954f5f757-gg284" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.568378 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.571492 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.579276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgkfc\" (UniqueName: \"kubernetes.io/projected/7ad9fda2-065b-4620-bc36-e33403fcdd53-kube-api-access-rgkfc\") pod \"controller-manager-879f6c89f-p5kqx\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.594337 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zmd8d" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.597874 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wgxq\" (UniqueName: \"kubernetes.io/projected/ccb13006-cd7f-4249-9d04-7391d26eaae3-kube-api-access-6wgxq\") pod \"package-server-manager-789f6589d5-7p82b\" (UID: \"ccb13006-cd7f-4249-9d04-7391d26eaae3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:03 crc kubenswrapper[4962]: W1201 21:36:03.641231 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25f2bd8_5e89_40b2_8c62_3c67a364384f.slice/crio-5b448cfa91a5abd948b2908a292f0ca444509e0a6ee3c7da418352e7822039c3 WatchSource:0}: Error finding container 5b448cfa91a5abd948b2908a292f0ca444509e0a6ee3c7da418352e7822039c3: Status 404 returned error can't find the container with id 5b448cfa91a5abd948b2908a292f0ca444509e0a6ee3c7da418352e7822039c3 Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672288 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672608 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-bound-sa-token\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672651 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-signing-key\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672694 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672717 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672766 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh5vl\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-kube-api-access-rh5vl\") pod 
\"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672788 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-trusted-ca\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672810 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-certificates\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672834 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-signing-cabundle\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.672925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfld\" (UniqueName: \"kubernetes.io/projected/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-kube-api-access-9tfld\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.673911 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-tls\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: E1201 21:36:03.675980 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.175967648 +0000 UTC m=+148.277406843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.688704 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.776211 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:03 crc kubenswrapper[4962]: E1201 21:36:03.776390 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.276362532 +0000 UTC m=+148.377801717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.776814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-config-volume\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.776852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.776965 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.777089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-bound-sa-token\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.777387 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-signing-key\") 
pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.777469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.783372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrth\" (UniqueName: \"kubernetes.io/projected/993bb116-5859-4547-84fd-2ed614d5ec8e-kube-api-access-swrth\") pod \"ingress-canary-n5kbb\" (UID: \"993bb116-5859-4547-84fd-2ed614d5ec8e\") " pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.785391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.785445 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: E1201 21:36:03.785723 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.285709328 +0000 UTC m=+148.387148533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.786885 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-metrics-tls\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.786984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-socket-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.788000 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-registration-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.788780 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh5vl\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-kube-api-access-rh5vl\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.788820 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-plugins-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.790761 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbpr9\" (UniqueName: \"kubernetes.io/projected/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-kube-api-access-wbpr9\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.790799 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-csi-data-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.790841 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-trusted-ca\") pod 
\"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.796743 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.796946 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-trusted-ca\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.801026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-certificates\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.801068 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-signing-cabundle\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.813122 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfld\" (UniqueName: \"kubernetes.io/projected/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-kube-api-access-9tfld\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.813139 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-certificates\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.813218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-tls\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.814252 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7x948"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.814347 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.814656 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rk5z\" (UniqueName: \"kubernetes.io/projected/be92f2e8-fa20-4d9b-8891-429dfc64490b-kube-api-access-5rk5z\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.814808 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-mountpoint-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.814923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/993bb116-5859-4547-84fd-2ed614d5ec8e-cert\") pod \"ingress-canary-n5kbb\" (UID: \"993bb116-5859-4547-84fd-2ed614d5ec8e\") " pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.815244 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.816312 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.816862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-signing-cabundle\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.817145 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-signing-key\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.821017 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w2wdh"] Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.821225 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-gg284" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.824747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-tls\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.826695 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-bound-sa-token\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.840637 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh5vl\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-kube-api-access-rh5vl\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.858955 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfld\" (UniqueName: \"kubernetes.io/projected/56be3765-fdc4-4de6-96bb-6a06bfc16c6b-kube-api-access-9tfld\") pod \"service-ca-9c57cc56f-85jfd\" (UID: \"56be3765-fdc4-4de6-96bb-6a06bfc16c6b\") " pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.864860 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.878909 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:03 crc kubenswrapper[4962]: E1201 21:36:03.916571 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.416520852 +0000 UTC m=+148.517960047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-config-volume\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swrth\" (UniqueName: \"kubernetes.io/projected/993bb116-5859-4547-84fd-2ed614d5ec8e-kube-api-access-swrth\") pod \"ingress-canary-n5kbb\" (UID: \"993bb116-5859-4547-84fd-2ed614d5ec8e\") " pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916727 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916746 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-metrics-tls\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-socket-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916779 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-registration-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-plugins-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916810 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbpr9\" (UniqueName: \"kubernetes.io/projected/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-kube-api-access-wbpr9\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " 
pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916826 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-csi-data-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916870 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rk5z\" (UniqueName: \"kubernetes.io/projected/be92f2e8-fa20-4d9b-8891-429dfc64490b-kube-api-access-5rk5z\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916896 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-mountpoint-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.916916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/993bb116-5859-4547-84fd-2ed614d5ec8e-cert\") pod \"ingress-canary-n5kbb\" (UID: \"993bb116-5859-4547-84fd-2ed614d5ec8e\") " pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.917482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-config-volume\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.917689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-registration-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: E1201 21:36:03.918032 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.418020582 +0000 UTC m=+148.519459777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.918315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-socket-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.918410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-mountpoint-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.918445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-plugins-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.918477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/be92f2e8-fa20-4d9b-8891-429dfc64490b-csi-data-dir\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.922659 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/993bb116-5859-4547-84fd-2ed614d5ec8e-cert\") pod \"ingress-canary-n5kbb\" (UID: \"993bb116-5859-4547-84fd-2ed614d5ec8e\") " pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:03 crc kubenswrapper[4962]: I1201 21:36:03.929603 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-metrics-tls\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.003645 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrth\" (UniqueName: \"kubernetes.io/projected/993bb116-5859-4547-84fd-2ed614d5ec8e-kube-api-access-swrth\") pod \"ingress-canary-n5kbb\" (UID: \"993bb116-5859-4547-84fd-2ed614d5ec8e\") " pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.009839 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rk5z\" (UniqueName: \"kubernetes.io/projected/be92f2e8-fa20-4d9b-8891-429dfc64490b-kube-api-access-5rk5z\") pod \"csi-hostpathplugin-tqsxt\" (UID: \"be92f2e8-fa20-4d9b-8891-429dfc64490b\") " pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.019847 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbpr9\" (UniqueName: \"kubernetes.io/projected/cfe98aa8-58d0-4ed5-ae64-8c0faa03e247-kube-api-access-wbpr9\") pod \"dns-default-n9nhh\" (UID: \"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247\") " pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.023644 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.024037 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.524005242 +0000 UTC m=+148.625444427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.024255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.024584 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.524566537 +0000 UTC m=+148.626005732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.125275 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.125687 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 21:36:04.625671439 +0000 UTC m=+148.727110634 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.175927 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" event={"ID":"06b421f6-7c09-4105-9aa3-06296e88a57f","Type":"ContainerStarted","Data":"782b604312317cf4b7417b38bfa8a36d66681d2de1990aa2e93df14d4d205b5d"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.176709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" event={"ID":"06b421f6-7c09-4105-9aa3-06296e88a57f","Type":"ContainerStarted","Data":"00c40a7b10504d3b736bb66f6965f6717cf1dd13b20b869183afb2cc33ebdc1c"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.187212 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" event={"ID":"570052d5-a3db-4720-b8ab-32b4d71c44f8","Type":"ContainerStarted","Data":"64c9d63b1308fc826b139cd9043099dc0a8020eb268a28350a30c1c5076bb44a"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.187252 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" event={"ID":"570052d5-a3db-4720-b8ab-32b4d71c44f8","Type":"ContainerStarted","Data":"c923938c1829e317bfd2b361037274256540de038c635f3db4f5cca989e5f9ce"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.196859 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" event={"ID":"2fb73eca-8b7b-488b-bea1-3ca7f1145fd7","Type":"ContainerStarted","Data":"c5067ecbb655f8729e17f03194c8d5e3b8fb724d208a0ce53366ec19ab66ee57"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.203363 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" event={"ID":"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e","Type":"ContainerStarted","Data":"72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.203406 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" event={"ID":"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e","Type":"ContainerStarted","Data":"9f060d82be232e97b5ecfd17fd7925a7feae037e2fac54fcc0f681d1afee65a7"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.206684 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n5kbb" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.207311 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.213264 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" event={"ID":"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d","Type":"ContainerStarted","Data":"243ac9bcd2a527083349d1436700df973ab745aaa51e2536766d67a7ab397060"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.215002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jsv9r" event={"ID":"4002c0b0-4b79-4755-a60c-2fc0cdac7876","Type":"ContainerStarted","Data":"8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.215063 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jsv9r" event={"ID":"4002c0b0-4b79-4755-a60c-2fc0cdac7876","Type":"ContainerStarted","Data":"21538d0a3c59fee3e4d2509fe44592ed0e6dfec886c727b0424564727581376b"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.217218 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" event={"ID":"04aab66b-a04f-459e-a760-eec00bec0115","Type":"ContainerStarted","Data":"6dfb71d124b459025fcbb8fd37e972c6eda579fee326816448e67c7e01514e5a"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.218216 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" event={"ID":"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1","Type":"ContainerStarted","Data":"5cd1ba508b836addae30081b39d6c666c5a9da9107de667e7889f7bee38c8b78"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.220117 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.226587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.227577 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.727566042 +0000 UTC m=+148.829005237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.243522 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" event={"ID":"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30","Type":"ContainerStarted","Data":"81bd054a9c4545b25af2e7516e552d1cb974a39777866900d4737f9c2b362db2"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323175 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" event={"ID":"fa1e2012-a9ba-488e-b877-4b0ee5f079b4","Type":"ContainerStarted","Data":"d27d159b45f1108bc09de7121710a4e918a8779c343cfa18428ea38102d1b7ad"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323211 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" event={"ID":"63788da5-7737-4b23-aef1-283bc26a4202","Type":"ContainerStarted","Data":"792fc4aef3114e459cc9754c98b37681a6a0118556cebec4895c899a90944e84"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" event={"ID":"9f31ef6d-c116-4335-bc5d-5357a379d202","Type":"ContainerStarted","Data":"0cb2d501eaa896500027c00abe2539f72fe5ef490e9abb7249ac70581d2653eb"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323240 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" event={"ID":"c25f2bd8-5e89-40b2-8c62-3c67a364384f","Type":"ContainerStarted","Data":"5b448cfa91a5abd948b2908a292f0ca444509e0a6ee3c7da418352e7822039c3"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323249 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zmd8d" event={"ID":"5ae96720-62a1-4f8e-b9b1-c234fdf318e8","Type":"ContainerStarted","Data":"dbe79a69e742e83325a6c17a2d4ee54712449ca912cc8f21272017b7a0be1bd9"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" event={"ID":"847a70f9-ec15-480b-8ed8-9d3cb006ce64","Type":"ContainerStarted","Data":"808ce0bb9cc3d2f97c12790e20e78a729998e3e1db42b54a7b254721c425e1f0"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.323268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hjqsp" event={"ID":"3bc8d3cc-b827-4f76-b41e-18e0790b6e66","Type":"ContainerStarted","Data":"6baae3d9e7e3f34157ff292d93e3878e8f9109a911c6cb6ca0d37aa0429a9052"} Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.327265 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.328388 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.828375866 +0000 UTC m=+148.929815061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.429603 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.431577 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:04.931562763 +0000 UTC m=+149.033001958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.517401 4962 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-csp6p container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 21:36:04 crc kubenswrapper[4962]: [+]log ok Dec 01 21:36:04 crc kubenswrapper[4962]: [-]poststarthook/max-in-flight-filter failed: reason withheld Dec 01 21:36:04 crc kubenswrapper[4962]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Dec 01 21:36:04 crc kubenswrapper[4962]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 01 21:36:04 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.517446 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" podUID="6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" containerName="oauth-openshift" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.530695 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.531078 4962 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.031061943 +0000 UTC m=+149.132501138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.531135 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.531529 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.031503315 +0000 UTC m=+149.132942510 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.632200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.632477 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.132463023 +0000 UTC m=+149.233902218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.697476 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.705077 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.710990 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.732981 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.733428 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.233412481 +0000 UTC m=+149.334851676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.735185 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.763672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lt2df"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.763867 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.774181 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sl474"] Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.780394 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cjns7"] Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.837262 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.337238475 +0000 UTC m=+149.438677670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.837603 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.837927 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.838244 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.338232621 +0000 UTC m=+149.439671816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.947820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.948184 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.448168656 +0000 UTC m=+149.549607851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:04 crc kubenswrapper[4962]: I1201 21:36:04.948309 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:04 crc kubenswrapper[4962]: E1201 21:36:04.948583 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.448576876 +0000 UTC m=+149.550016071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.055579 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.057657 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.058019 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.558006568 +0000 UTC m=+149.659445763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.084097 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.084161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.127479 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qj8zv"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.132294 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.142963 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7"] Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.161346 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.661328578 +0000 UTC m=+149.762767773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.160241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:05 crc kubenswrapper[4962]: W1201 21:36:05.177740 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12136e64_010e_49bc_9c3e_d1c65467f361.slice/crio-0c583b7b9dc48d677076cc19f774031c59f0f5258ce287309ee7e98499058a90 WatchSource:0}: Error finding container 0c583b7b9dc48d677076cc19f774031c59f0f5258ce287309ee7e98499058a90: Status 404 returned error can't find the container with id 0c583b7b9dc48d677076cc19f774031c59f0f5258ce287309ee7e98499058a90 Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.185204 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n5kbb"] Dec 01 21:36:05 crc kubenswrapper[4962]: W1201 21:36:05.211477 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-4529f70ef2c39db4d62bb63d189628d10949b53d58e6fdff27f4214121bc6456 WatchSource:0}: Error finding container 4529f70ef2c39db4d62bb63d189628d10949b53d58e6fdff27f4214121bc6456: Status 404 returned error can't find the container with id 4529f70ef2c39db4d62bb63d189628d10949b53d58e6fdff27f4214121bc6456 Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.215496 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n9nhh"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.232313 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-85jfd"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.252339 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gg284"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.276042 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.276418 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.776402298 +0000 UTC m=+149.877841493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.279124 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tqsxt"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.280805 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.282365 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.301765 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.325412 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stv9m"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.340467 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5kqx"] Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.361106 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" event={"ID":"9f31ef6d-c116-4335-bc5d-5357a379d202","Type":"ContainerStarted","Data":"b9d230f604de3dbaac3935eaf8abd50e93cf91127020dc336d7525bbdcb75ff3"} Dec 01 21:36:05 crc kubenswrapper[4962]: W1201 21:36:05.364086 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb13006_cd7f_4249_9d04_7391d26eaae3.slice/crio-c6a8335c3a431cf0c445e8fc2958b0f9d4384622eb607564b051fdfd11cac6c0 WatchSource:0}: Error finding container c6a8335c3a431cf0c445e8fc2958b0f9d4384622eb607564b051fdfd11cac6c0: Status 404 returned error can't find the container with id c6a8335c3a431cf0c445e8fc2958b0f9d4384622eb607564b051fdfd11cac6c0 Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.364700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" event={"ID":"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1","Type":"ContainerStarted","Data":"3e434e56e8d8e427aca83f8a41d5495766b20d3327aac6c7f24b6a08aa14e151"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.369740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" event={"ID":"c08b9259-9d06-4450-a2e6-09740b95fdea","Type":"ContainerStarted","Data":"dfb41449370889f156da383b03c6b13d71d5532e58c50d70683b362d9abeadf2"} Dec 01 21:36:05 crc kubenswrapper[4962]: W1201 21:36:05.372850 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfde3e2c2_ed59_4cbf_8554_1a0438eb81dc.slice/crio-edf4629c34895a810e8726f6ac94a76f24dddf7936bc79e2a8faddea44a51dd1 WatchSource:0}: Error finding container 
edf4629c34895a810e8726f6ac94a76f24dddf7936bc79e2a8faddea44a51dd1: Status 404 returned error can't find the container with id edf4629c34895a810e8726f6ac94a76f24dddf7936bc79e2a8faddea44a51dd1 Dec 01 21:36:05 crc kubenswrapper[4962]: W1201 21:36:05.377755 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e83dc57de9312fa6dfc74a68b4bcce5590d0e460fa073031d26a3421133d699b WatchSource:0}: Error finding container e83dc57de9312fa6dfc74a68b4bcce5590d0e460fa073031d26a3421133d699b: Status 404 returned error can't find the container with id e83dc57de9312fa6dfc74a68b4bcce5590d0e460fa073031d26a3421133d699b Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.379205 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.379709 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.879696318 +0000 UTC m=+149.981135513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.381543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hjqsp" event={"ID":"3bc8d3cc-b827-4f76-b41e-18e0790b6e66","Type":"ContainerStarted","Data":"ab689686354bf0522bdb924bead6a2926570ab76cce2dbd8c6575acfde4d1c3c"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.383844 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" event={"ID":"4fd77fbe-c9db-4f41-9c98-a8f3490c1b30","Type":"ContainerStarted","Data":"b7d8fba3895e27bfd6d73b0dd5e97a6d47c4181bae3a13ec6843d584bc49d0de"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.385736 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" event={"ID":"95b5b235-1db6-458c-bdf9-c065eb0c70e5","Type":"ContainerStarted","Data":"2402a58e6c77f7277037317a517ac4685cc9d5909324c9e7951f089346a88540"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.395196 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9wbzf" podStartSLOduration=127.395178515 podStartE2EDuration="2m7.395178515s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.395154795 +0000 UTC m=+149.496593990" 
watchObservedRunningTime="2025-12-01 21:36:05.395178515 +0000 UTC m=+149.496617710" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.420632 4962 generic.go:334] "Generic (PLEG): container finished" podID="847a70f9-ec15-480b-8ed8-9d3cb006ce64" containerID="a52e002f603e3b68a6563092988de0c50a403db1cc8325531782ad2b2b9cb59c" exitCode=0 Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.420870 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" event={"ID":"847a70f9-ec15-480b-8ed8-9d3cb006ce64","Type":"ContainerDied","Data":"a52e002f603e3b68a6563092988de0c50a403db1cc8325531782ad2b2b9cb59c"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.437320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n5kbb" event={"ID":"993bb116-5859-4547-84fd-2ed614d5ec8e","Type":"ContainerStarted","Data":"cd1879e2b773e240c3a4280ed1703c780b6fd69d486f165b7eeb3fce527f6f71"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.441186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" event={"ID":"12136e64-010e-49bc-9c3e-d1c65467f361","Type":"ContainerStarted","Data":"0c583b7b9dc48d677076cc19f774031c59f0f5258ce287309ee7e98499058a90"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.454064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cjns7" event={"ID":"7b57c377-c954-4559-af66-e64599cf5f71","Type":"ContainerStarted","Data":"b7a9b7432d9d5cff3e19343e05b355ae2df7403d3b60b1fe6b1a2f8c1b3ff873"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.454116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cjns7" event={"ID":"7b57c377-c954-4559-af66-e64599cf5f71","Type":"ContainerStarted","Data":"7053e2a1cafd9da33afedae35a8649ec37052df2b3f48640afcc599412a2f2c4"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.455757 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cjns7" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.456581 4962 patch_prober.go:28] interesting pod/console-operator-58897d9998-cjns7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.456644 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cjns7" podUID="7b57c377-c954-4559-af66-e64599cf5f71" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.457899 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" event={"ID":"0b40c111-e562-48b7-9db2-1a494e16786c","Type":"ContainerStarted","Data":"9debe690c2efeeded20d4a128ce66c77d55f9fffef6ca74aff96c3fe16b8faed"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.461199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" 
event={"ID":"a56487ec-ba83-42dc-b8b4-35c353fbc836","Type":"ContainerStarted","Data":"08727e632521988240131cd3854676add5bac196b6fc56657dde5a3cc8f96ba5"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.461243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" event={"ID":"a56487ec-ba83-42dc-b8b4-35c353fbc836","Type":"ContainerStarted","Data":"4ff56ce1e177f4ce73ce8041d9e02bad8c51d0a9d37c0bde442df6daf17bd5df"} Dec 01 21:36:05 crc kubenswrapper[4962]: W1201 21:36:05.461581 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-fdb7639c6b652e87e5a5ffced0bb34a23f3f0e4c9ff42fe56206726c149b2f5b WatchSource:0}: Error finding container fdb7639c6b652e87e5a5ffced0bb34a23f3f0e4c9ff42fe56206726c149b2f5b: Status 404 returned error can't find the container with id fdb7639c6b652e87e5a5ffced0bb34a23f3f0e4c9ff42fe56206726c149b2f5b Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.461624 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.463531 4962 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-h7flt container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.463568 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" podUID="a56487ec-ba83-42dc-b8b4-35c353fbc836" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.465354 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" event={"ID":"fa1e2012-a9ba-488e-b877-4b0ee5f079b4","Type":"ContainerStarted","Data":"c556e1113aabfc1f4f750417cf11e2e7462319a44935fd9518b067dcddc37019"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.473479 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9pvs" podStartSLOduration=127.473463317 podStartE2EDuration="2m7.473463317s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.473288912 +0000 UTC m=+149.574728107" watchObservedRunningTime="2025-12-01 21:36:05.473463317 +0000 UTC m=+149.574902502" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.475592 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w86m4" podStartSLOduration=127.475582843 podStartE2EDuration="2m7.475582843s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.436972546 +0000 UTC m=+149.538411741" 
watchObservedRunningTime="2025-12-01 21:36:05.475582843 +0000 UTC m=+149.577022038" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.478060 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" event={"ID":"04aab66b-a04f-459e-a760-eec00bec0115","Type":"ContainerStarted","Data":"e500c65e5186002e820e721471808e5fa9abfb3e294308e3b46015957c6109ae"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.479584 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" event={"ID":"931bdb70-51c0-4893-b3b9-6fe8dd700233","Type":"ContainerStarted","Data":"37229accdd6b0c6b9102c82a9dbda9806f89bb0f3eb6d33858052a4fea292c1b"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.481876 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" event={"ID":"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d","Type":"ContainerStarted","Data":"26029fb0eea0d5fc2fa03eee51269d03aa02621f21d944ef10ff04ff1a4c4bdd"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.481900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" event={"ID":"02ee2c67-a501-4ae1-a1c3-f9cc8fa1650d","Type":"ContainerStarted","Data":"58d4952767665513c1663700f8070a5ed7a0f6b8958c8ddbbe3d6ccb1f109a00"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.484663 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" event={"ID":"696189ea-f7d5-445e-a07e-ebb32f5e219c","Type":"ContainerStarted","Data":"b2e1e0fafb7230634305958b2229f0f3279a1f95caf53cb92e2ca4405b5c8583"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.486150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.486262 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.986236343 +0000 UTC m=+150.087675528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.486827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" event={"ID":"f4a93f57-1e83-4b94-b1eb-fc28eea15503","Type":"ContainerStarted","Data":"4239cfa55fd2e7948237e6c8afa81a1321ef69b68a3f91a1f49abce9db77c393"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.486994 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.490295 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:05.99028212 +0000 UTC m=+150.091721315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.492521 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" event={"ID":"a3bc6803-77e3-4e63-bfaa-761500e161de","Type":"ContainerStarted","Data":"268d819ad330a63d2c0cabe494d3fefd3b680d9f6f82df97a7b17f6948d127a8"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.521138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zmd8d" event={"ID":"5ae96720-62a1-4f8e-b9b1-c234fdf318e8","Type":"ContainerStarted","Data":"8ffe42f6b7630ffddbd2e9560a5153368b5b1d8c20ef6146c3db8cb1c584ea68"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.536437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" event={"ID":"6b92760e-ca08-4b9c-b2a0-3f522ba6a975","Type":"ContainerStarted","Data":"4fd164fa16f7b78ace09def9c1e5e4400e74cb4bca80ebf0491179c369e6146e"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.537248 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zmd8d" podStartSLOduration=5.537231536 podStartE2EDuration="5.537231536s" podCreationTimestamp="2025-12-01 21:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 21:36:05.536683331 +0000 UTC m=+149.638122526" watchObservedRunningTime="2025-12-01 21:36:05.537231536 +0000 UTC m=+149.638670741" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.571300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4529f70ef2c39db4d62bb63d189628d10949b53d58e6fdff27f4214121bc6456"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.588743 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.591262 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.091239218 +0000 UTC m=+150.192678423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.597526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" event={"ID":"c25f2bd8-5e89-40b2-8c62-3c67a364384f","Type":"ContainerStarted","Data":"6ee5d9a41485992ebbe00cc2ddb2c06045f973099c194c2c342691d1a0bd0674"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.629196 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" event={"ID":"81201e01-7100-49fe-a7d2-d402fa8fe9af","Type":"ContainerStarted","Data":"24c563b5c33de96eae86068959a60bcd73154be38320dc2ec6d862c5e2fe1442"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.644695 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" podStartSLOduration=127.644678375 podStartE2EDuration="2m7.644678375s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.641006998 +0000 UTC m=+149.742446193" watchObservedRunningTime="2025-12-01 21:36:05.644678375 +0000 UTC m=+149.746117580" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.653676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" event={"ID":"979f8807-55d0-475c-9cd6-08b47d99f9e2","Type":"ContainerStarted","Data":"f1634344f874c774c4d0256dac20b3875e3cb170fed5a3e9ec009f9bd9be440d"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.657276 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" podStartSLOduration=127.657251966 podStartE2EDuration="2m7.657251966s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.571751985 +0000 UTC m=+149.673191180" watchObservedRunningTime="2025-12-01 21:36:05.657251966 +0000 UTC m=+149.758691181" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.668044 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" event={"ID":"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34","Type":"ContainerStarted","Data":"4bd3682a0231b1cc286c33cd99673e2d6760c5dc19a8ed1933dd9dd2000d6e74"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.668079 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" event={"ID":"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34","Type":"ContainerStarted","Data":"619ebd5920a9e3102022f4b131e62761a9eda4a60ca4a89c2e607cb170e98e91"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.673965 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.696375 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.697404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.698313 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.198302677 +0000 UTC m=+150.299741872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.711083 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" event={"ID":"2fb73eca-8b7b-488b-bea1-3ca7f1145fd7","Type":"ContainerStarted","Data":"f773a868edc5389f20066e6af0cbb02cb1a12156fd69347ddafbcf3336e38b38"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.711132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" event={"ID":"2fb73eca-8b7b-488b-bea1-3ca7f1145fd7","Type":"ContainerStarted","Data":"d8999412ca273e701bbeae7cb6e5a3895b919d9623472a034e0e64be3f72d136"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.719101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" event={"ID":"00f6ed0c-f791-460d-acd4-d100a0b21710","Type":"ContainerStarted","Data":"efe9eaa6e6b31324cc57d324593c74166dc370b13eee6e06dcf3ce16b6a3f83e"} Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.739470 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jsv9r" podStartSLOduration=127.73945569 podStartE2EDuration="2m7.73945569s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.738991878 +0000 UTC m=+149.840431073" watchObservedRunningTime="2025-12-01 21:36:05.73945569 +0000 UTC m=+149.840894885" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.741183 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.767817 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-frc7j" podStartSLOduration=127.767799607 podStartE2EDuration="2m7.767799607s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.766527163 +0000 UTC m=+149.867966368" watchObservedRunningTime="2025-12-01 21:36:05.767799607 +0000 UTC m=+149.869238802" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.803831 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.804027 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 21:36:06.30399982 +0000 UTC m=+150.405439015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.804105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.806109 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.306094045 +0000 UTC m=+150.407533240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.834070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.864433 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hjqsp" podStartSLOduration=127.86441343 podStartE2EDuration="2m7.86441343s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.857710684 +0000 UTC m=+149.959149899" watchObservedRunningTime="2025-12-01 21:36:05.86441343 +0000 UTC m=+149.965852625" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.891248 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:05 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:05 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:05 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.891307 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.908580 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:05 crc kubenswrapper[4962]: E1201 21:36:05.908874 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.408856371 +0000 UTC m=+150.510295566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.962815 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt" podStartSLOduration=127.962798481 podStartE2EDuration="2m7.962798481s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.909094597 +0000 UTC m=+150.010533792" watchObservedRunningTime="2025-12-01 21:36:05.962798481 +0000 UTC m=+150.064237676" Dec 01 21:36:05 crc kubenswrapper[4962]: I1201 21:36:05.963267 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cj2tt" podStartSLOduration=127.963261723 podStartE2EDuration="2m7.963261723s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:05.962275127 +0000 UTC m=+150.063714322" watchObservedRunningTime="2025-12-01 21:36:05.963261723 +0000 UTC m=+150.064700938" Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.016063 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h69ht" podStartSLOduration=128.016045973 podStartE2EDuration="2m8.016045973s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:06.003509233 +0000 UTC m=+150.104948428" watchObservedRunningTime="2025-12-01 21:36:06.016045973 +0000 UTC m=+150.117485168" Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.016815 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-cjns7" podStartSLOduration=128.016809893 podStartE2EDuration="2m8.016809893s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:06.014762509 +0000 UTC m=+150.116201704" watchObservedRunningTime="2025-12-01 21:36:06.016809893 +0000 UTC m=+150.118249078" Dec 01 
21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.020615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.020876 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.52086473 +0000 UTC m=+150.622303925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.032499 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmpj" podStartSLOduration=128.032462345 podStartE2EDuration="2m8.032462345s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:06.031482729 +0000 UTC m=+150.132921924" watchObservedRunningTime="2025-12-01 21:36:06.032462345 +0000 UTC m=+150.133901550" Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.065239 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" podStartSLOduration=128.065223648 podStartE2EDuration="2m8.065223648s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:06.06416986 +0000 UTC m=+150.165609065" watchObservedRunningTime="2025-12-01 21:36:06.065223648 +0000 UTC m=+150.166662843" Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.088571 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hvvd4" podStartSLOduration=128.088542572 podStartE2EDuration="2m8.088542572s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:06.08505125 +0000 UTC m=+150.186490445" watchObservedRunningTime="2025-12-01 21:36:06.088542572 +0000 UTC m=+150.189981767" Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.120216 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" podStartSLOduration=128.120202465 podStartE2EDuration="2m8.120202465s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:06.118542372 
+0000 UTC m=+150.219981577" watchObservedRunningTime="2025-12-01 21:36:06.120202465 +0000 UTC m=+150.221641650" Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.125478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.125690 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.625652459 +0000 UTC m=+150.727091654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.125744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.126271 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.626258325 +0000 UTC m=+150.727697520 (durationBeforeRetry 500ms). 
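
Note: the pod_startup_latency_tracker records scattered through this stretch compute podStartSLOduration as the observed running time minus podCreationTimestamp, which is why pods created at 21:33:58 all report roughly 2m7s ("127.4s") here. Reproducing the arithmetic for one entry taken from the log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, err := time.Parse(layout, "2025-12-01 21:33:58 +0000 UTC")
        if err != nil {
            panic(err)
        }
        running, err := time.Parse(layout, "2025-12-01 21:36:05.395178515 +0000 UTC")
        if err != nil {
            panic(err)
        }
        d := running.Sub(created)
        fmt.Println(d)           // 2m7.395178515s
        fmt.Println(d.Seconds()) // 127.395178515, the podStartSLOduration value
    }
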
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.144615 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-g44h6" podStartSLOduration=128.144596418 podStartE2EDuration="2m8.144596418s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:06.14165372 +0000 UTC m=+150.243092925" watchObservedRunningTime="2025-12-01 21:36:06.144596418 +0000 UTC m=+150.246035613" Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.228829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.229367 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.729352019 +0000 UTC m=+150.830791214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.334743 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.335069 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.835052042 +0000 UTC m=+150.936491227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.437069 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.437151 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.937129639 +0000 UTC m=+151.038568834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.437521 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.437901 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:06.937889489 +0000 UTC m=+151.039328684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.541484 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.541888 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.041872817 +0000 UTC m=+151.143312012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.642475 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.642754 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.142743953 +0000 UTC m=+151.244183148 (durationBeforeRetry 500ms). 
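
Note: the router startup-probe records (at 21:36:05.891 above, and again at 21:36:06.835 below) show the aggregated form the kubelet logs for healthz-style endpoints: each sub-check is reported as "[+]name ok" or "[-]name failed", and any failing check makes the endpoint answer HTTP 500, which the prober records as "HTTP probe failed with statuscode: 500". A minimal sketch of that aggregation style (check names taken from the log; the reporting logic itself is illustrative, not the router's actual implementation):

    package main

    import "fmt"

    type check struct {
        name string
        ok   bool
    }

    // healthz renders an aggregate report in the style seen in the log and
    // returns whether the endpoint would answer 200 or 500.
    func healthz(checks []check) (report string, healthy bool) {
        healthy = true
        for _, c := range checks {
            if c.ok {
                report += fmt.Sprintf("[+]%s ok\n", c.name)
            } else {
                report += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                healthy = false
            }
        }
        if !healthy {
            report += "healthz check failed\n"
        }
        return report, healthy
    }

    func main() {
        report, ok := healthz([]check{
            {"backend-http", false},
            {"has-synced", false},
            {"process-running", true},
        })
        fmt.Print(report)
        fmt.Println("healthy:", ok) // false -> the probe sees statuscode 500
    }
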
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.745200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.745509 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.245493989 +0000 UTC m=+151.346933184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.793300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" event={"ID":"a3bc6803-77e3-4e63-bfaa-761500e161de","Type":"ContainerStarted","Data":"494a1ccad2d7d166ac5e78020353322ef1143809c9cc923a90c6225c01117ecf"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.793561 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" event={"ID":"a3bc6803-77e3-4e63-bfaa-761500e161de","Type":"ContainerStarted","Data":"4c0045f4825ba522e3f113314f7767149412ce2e738de1213415c60079b30d98"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.801955 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b40c111-e562-48b7-9db2-1a494e16786c" containerID="a3f3634aa65f531b3dd9036ba045cd69a06e787911d0da61b146d738f76bbe0d" exitCode=0
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.802633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" event={"ID":"0b40c111-e562-48b7-9db2-1a494e16786c","Type":"ContainerDied","Data":"a3f3634aa65f531b3dd9036ba045cd69a06e787911d0da61b146d738f76bbe0d"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.804347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" event={"ID":"95b5b235-1db6-458c-bdf9-c065eb0c70e5","Type":"ContainerStarted","Data":"9462fec68562e3c059eadd8f86c503739f113e8efb15fac238b15a22a6da06d9"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.805372 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7"
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.809130 4962 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6ffb7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body=
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.809189 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" podUID="95b5b235-1db6-458c-bdf9-c065eb0c70e5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused"
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.819565 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"855069b6878eb79d7ea95a847a697d68a2ce26c7f6b33f6777a243fe4138ab26"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.823571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" event={"ID":"f4a93f57-1e83-4b94-b1eb-fc28eea15503","Type":"ContainerStarted","Data":"238c709bcbcda4eb0875b5e86710182829f6d17bdd10347c93580431d10c0736"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.824268 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5"
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.825317 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" event={"ID":"979f8807-55d0-475c-9cd6-08b47d99f9e2","Type":"ContainerStarted","Data":"22021c647b1260b43fb47df5bcfa84d7e17a470d889fc1127c41daaa8cd3b366"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.835187 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 01 21:36:06 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld
Dec 01 21:36:06 crc kubenswrapper[4962]: [+]process-running ok
Dec 01 21:36:06 crc kubenswrapper[4962]: healthz check failed
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.835266 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.844410 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" event={"ID":"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc","Type":"ContainerStarted","Data":"578cf2c176fdcfafd49e6d657f8c064a6770b8a85590ca82bc4ea2b72aa4403d"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.844461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" event={"ID":"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc","Type":"ContainerStarted","Data":"edf4629c34895a810e8726f6ac94a76f24dddf7936bc79e2a8faddea44a51dd1"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.846595 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.848448 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.348430339 +0000 UTC m=+151.449869604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.859427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" event={"ID":"7ad9fda2-065b-4620-bc36-e33403fcdd53","Type":"ContainerStarted","Data":"47d64b0df8f4445f2fb91407f9f29c297bb37075610f907fbd8ab63be614b20d"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.859465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" event={"ID":"7ad9fda2-065b-4620-bc36-e33403fcdd53","Type":"ContainerStarted","Data":"0c73355c783b14554f90a7dfa86cc2c0bb326ec9c8c05b5283ab8fc662357270"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.860200 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx"
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.861460 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5"
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.863814 4962 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-p5kqx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.863845 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" podUID="7ad9fda2-065b-4620-bc36-e33403fcdd53" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.864885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nl5jh" event={"ID":"00f6ed0c-f791-460d-acd4-d100a0b21710","Type":"ContainerStarted","Data":"fd350d7b1ef6a5438c0e12b47337454eb8a02308e5376e9ed92a6c3491f37851"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.882688 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" event={"ID":"696189ea-f7d5-445e-a07e-ebb32f5e219c","Type":"ContainerStarted","Data":"b61b10d52d7e224a654cd5e6ea4ddbabc4f679332b10ccff286076ae44bfe2d1"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.931457 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"48f988eb887a9dbbb69c9036cd73dd5c9d57f7c13ed677f2647e7209c2c6df0c"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.931499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fdb7639c6b652e87e5a5ffced0bb34a23f3f0e4c9ff42fe56206726c149b2f5b"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.947650 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:06 crc kubenswrapper[4962]: E1201 21:36:06.949051 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.449036098 +0000 UTC m=+151.550475293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.953897 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n9nhh" event={"ID":"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247","Type":"ContainerStarted","Data":"8c55e05d5cd0d186b7c4ea506fda15b84e7215c80facc6eabcbc2b3a1d4aafe8"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.956365 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" event={"ID":"10c473f1-a2f6-4565-ba09-d7e28dea1600","Type":"ContainerStarted","Data":"fc0bb9f7e649c122bee6ac8bc39b7e98ca7701fa577075ed346cdf66cc5fd574"}
Dec 01 21:36:06 crc kubenswrapper[4962]: I1201 21:36:06.956385 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" event={"ID":"10c473f1-a2f6-4565-ba09-d7e28dea1600","Type":"ContainerStarted","Data":"3c1b34230855ef0c2db04e2a5506984a803403736c5f69989c5089eb91b83793"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.014051 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" event={"ID":"cab55760-3ea7-41e5-8ac2-7e8c44f5ecb1","Type":"ContainerStarted","Data":"de844423fd6d23b2e33a74110ce8065c06cf1fa11e3cf129ec8d39928510d2b0"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.018909 4962 generic.go:334] "Generic (PLEG): container finished" podID="931bdb70-51c0-4893-b3b9-6fe8dd700233" containerID="74aec4f7658d6f18a6f959ad1c1ab5954ec899edf894fca5621919a28a562dbb" exitCode=0
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.019016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" event={"ID":"931bdb70-51c0-4893-b3b9-6fe8dd700233","Type":"ContainerDied","Data":"74aec4f7658d6f18a6f959ad1c1ab5954ec899edf894fca5621919a28a562dbb"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.025146 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" event={"ID":"be92f2e8-fa20-4d9b-8891-429dfc64490b","Type":"ContainerStarted","Data":"1dd592697ea22c15d1bf9f813763b7279648f66788546ac8ad2416a473b5b935"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.025186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" event={"ID":"be92f2e8-fa20-4d9b-8891-429dfc64490b","Type":"ContainerStarted","Data":"9db5be929fc2bf0e2edf4af341111b1168f22023dd9cfa1c7316d242ba3284a8"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.030487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" event={"ID":"c08b9259-9d06-4450-a2e6-09740b95fdea","Type":"ContainerStarted","Data":"424bb622f7e14ce9b1817ff2edc53a84d89a9b7ecd47b9950ac5d70a579b0f94"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.038472 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" podStartSLOduration=129.038450942 podStartE2EDuration="2m9.038450942s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.000529934 +0000 UTC m=+151.101969139" watchObservedRunningTime="2025-12-01 21:36:07.038450942 +0000 UTC m=+151.139890137"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.051751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" event={"ID":"81201e01-7100-49fe-a7d2-d402fa8fe9af","Type":"ContainerStarted","Data":"25039f64490694c7329f426ba12af21d64f2e10089a26118215a25de31bbaeaa"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.052115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.052373 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.552361928 +0000 UTC m=+151.653801123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.073110 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" event={"ID":"ccb13006-cd7f-4249-9d04-7391d26eaae3","Type":"ContainerStarted","Data":"c6a8335c3a431cf0c445e8fc2958b0f9d4384622eb607564b051fdfd11cac6c0"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.086143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n5kbb" event={"ID":"993bb116-5859-4547-84fd-2ed614d5ec8e","Type":"ContainerStarted","Data":"403b9310e01dd5f008bdec383869d6f89644b806fc92440eb298d62b86f8bc2f"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.106314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" event={"ID":"f10f3763-03b0-43d0-88fd-ce89274a67d9","Type":"ContainerStarted","Data":"691540e4553b2a2f34c4a2d6839c721ee5869b0bb40c03edf2e133c68c5c2c45"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.107027 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.120629 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-stv9m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.120673 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" podUID="f10f3763-03b0-43d0-88fd-ce89274a67d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.133465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" event={"ID":"847a70f9-ec15-480b-8ed8-9d3cb006ce64","Type":"ContainerStarted","Data":"6e67015c13a80631dd8cc072ce56610cc0a35ceb44e3053ebf24272fae7b341b"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.154795 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mpxbb"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.155629 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.165608 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.166074 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.166204 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.666156525 +0000 UTC m=+151.767595720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.166765 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.169964 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.669954435 +0000 UTC m=+151.771393630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.190912 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kg8bg" podStartSLOduration=129.190888216 podStartE2EDuration="2m9.190888216s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.163664999 +0000 UTC m=+151.265104194" watchObservedRunningTime="2025-12-01 21:36:07.190888216 +0000 UTC m=+151.292327411"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.204769 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mpxbb"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.208650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" event={"ID":"12136e64-010e-49bc-9c3e-d1c65467f361","Type":"ContainerStarted","Data":"8344d70affc4c231371113058fe03f00b74ea544cf8e525ed6059198a3bae056"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.241236 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" podStartSLOduration=129.241217461 podStartE2EDuration="2m9.241217461s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.235608913 +0000 UTC m=+151.337048108" watchObservedRunningTime="2025-12-01 21:36:07.241217461 +0000 UTC m=+151.342656656"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.256176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" event={"ID":"56be3765-fdc4-4de6-96bb-6a06bfc16c6b","Type":"ContainerStarted","Data":"e5ea40e33a5ed4b5b531d1541cdc1c7921bbfeb4cbf1f76a2fce6869273b2191"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.256215 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" event={"ID":"56be3765-fdc4-4de6-96bb-6a06bfc16c6b","Type":"ContainerStarted","Data":"0b9a3f27e13d20c43e8ea3c0d5a44fe58b8f0a270c87fc1d86e904db1309c5d2"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.270387 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.270630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfzb\" (UniqueName: \"kubernetes.io/projected/b9788313-1ab9-4dce-8fd0-363c8086d8d3-kube-api-access-cxfzb\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.270676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-catalog-content\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.270720 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-utilities\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.271567 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.77155128 +0000 UTC m=+151.872990475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.283498 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e83dc57de9312fa6dfc74a68b4bcce5590d0e460fa073031d26a3421133d699b"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.284083 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.314510 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" podStartSLOduration=129.31449477 podStartE2EDuration="2m9.31449477s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.281091251 +0000 UTC m=+151.382530456" watchObservedRunningTime="2025-12-01 21:36:07.31449477 +0000 UTC m=+151.415933965"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.320426 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" event={"ID":"6b92760e-ca08-4b9c-b2a0-3f522ba6a975","Type":"ContainerStarted","Data":"49d8df409d1ddc8e52e272dfda2179c4937d47f8a83e3d807cf98ddd2b23806a"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.333703 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gg284" event={"ID":"710d5a74-c24a-452e-a5bf-1c23b3589361","Type":"ContainerStarted","Data":"8f81f2b474c20f8da1aeddca68f63284cfdd355b6d15d6cd327f29c2914505ab"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.333743 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gg284"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.333753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gg284" event={"ID":"710d5a74-c24a-452e-a5bf-1c23b3589361","Type":"ContainerStarted","Data":"a8232a71e655d8fb2df107d985b242d6c51ede226b1647a04ef51987681d6c69"}
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.337413 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gg284 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.337465 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gg284" podUID="710d5a74-c24a-452e-a5bf-1c23b3589361" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.356335 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h7flt"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.357398 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cjns7"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.358157 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rvt4l" podStartSLOduration=129.35814791 podStartE2EDuration="2m9.35814791s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.314180382 +0000 UTC m=+151.415619577" watchObservedRunningTime="2025-12-01 21:36:07.35814791 +0000 UTC m=+151.459587095"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.359967 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6knc"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.360783 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.371565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.371887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.371978 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfzb\" (UniqueName: \"kubernetes.io/projected/b9788313-1ab9-4dce-8fd0-363c8086d8d3-kube-api-access-cxfzb\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.372119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-catalog-content\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.372241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-utilities\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.373425 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ljnp5" podStartSLOduration=129.373405052 podStartE2EDuration="2m9.373405052s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.356793254 +0000 UTC m=+151.458232449" watchObservedRunningTime="2025-12-01 21:36:07.373405052 +0000 UTC m=+151.474844237"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.374628 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-catalog-content\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.377529 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-utilities\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.378386 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.878373642 +0000 UTC m=+151.979812827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.406318 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6knc"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.420702 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sl474" podStartSLOduration=129.420683786 podStartE2EDuration="2m9.420683786s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.418505939 +0000 UTC m=+151.519945134" watchObservedRunningTime="2025-12-01 21:36:07.420683786 +0000 UTC m=+151.522122981"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.472892 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n5kbb" podStartSLOduration=7.472874121 podStartE2EDuration="7.472874121s" podCreationTimestamp="2025-12-01 21:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.470216721 +0000 UTC m=+151.571655926" watchObservedRunningTime="2025-12-01 21:36:07.472874121 +0000 UTC m=+151.574313316"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.474996 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.475283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfzb\" (UniqueName: \"kubernetes.io/projected/b9788313-1ab9-4dce-8fd0-363c8086d8d3-kube-api-access-cxfzb\") pod \"community-operators-mpxbb\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.475060 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.975045478 +0000 UTC m=+152.076484663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.475574 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-utilities\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.475611 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-catalog-content\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.475633 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.475676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz8pv\" (UniqueName: \"kubernetes.io/projected/a4620db4-9171-44b5-b944-dcf2e871ef41-kube-api-access-qz8pv\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.475946 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:07.975923791 +0000 UTC m=+152.077362986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.494152 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpxbb"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.501835 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" podStartSLOduration=129.501817663 podStartE2EDuration="2m9.501817663s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.499475271 +0000 UTC m=+151.600914466" watchObservedRunningTime="2025-12-01 21:36:07.501817663 +0000 UTC m=+151.603256858"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.562212 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zbnc9" podStartSLOduration=128.562193232 podStartE2EDuration="2m8.562193232s" podCreationTimestamp="2025-12-01 21:33:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.561266638 +0000 UTC m=+151.662705843" watchObservedRunningTime="2025-12-01 21:36:07.562193232 +0000 UTC m=+151.663632437"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.562474 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l5625"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.563770 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.578760 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.579025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz8pv\" (UniqueName: \"kubernetes.io/projected/a4620db4-9171-44b5-b944-dcf2e871ef41-kube-api-access-qz8pv\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.579098 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-utilities\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.579134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-catalog-content\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.579607 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-catalog-content\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.579686 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.079661062 +0000 UTC m=+152.181100257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.580054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-utilities\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.595997 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l5625"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.640349 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz8pv\" (UniqueName: \"kubernetes.io/projected/a4620db4-9171-44b5-b944-dcf2e871ef41-kube-api-access-qz8pv\") pod \"certified-operators-x6knc\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.682583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.682673 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-catalog-content\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.682695 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-utilities\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.682716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkgp7\" (UniqueName: \"kubernetes.io/projected/1569b19f-7f89-465d-9140-e2dce5e33425-kube-api-access-fkgp7\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.682968 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.182957072 +0000 UTC m=+152.284396267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.723384 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6knc"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.759157 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rlqt5"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.760611 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.769655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlqt5"]
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.787794 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.788201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-catalog-content\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.788254 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-utilities\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.788286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkgp7\" (UniqueName: \"kubernetes.io/projected/1569b19f-7f89-465d-9140-e2dce5e33425-kube-api-access-fkgp7\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.788704 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.288686176 +0000 UTC m=+152.390125371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.789273 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-utilities\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.791474 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-catalog-content\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.828349 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkgk9" podStartSLOduration=129.82833387 podStartE2EDuration="2m9.82833387s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.828232537 +0000 UTC m=+151.929671742" watchObservedRunningTime="2025-12-01 21:36:07.82833387 +0000 UTC m=+151.929773065"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.828614 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" podStartSLOduration=129.828609067 podStartE2EDuration="2m9.828609067s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.775793317 +0000 UTC m=+151.877232512" watchObservedRunningTime="2025-12-01 21:36:07.828609067 +0000 UTC m=+151.930048252"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.843195 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 01 21:36:07 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld
Dec 01 21:36:07 crc kubenswrapper[4962]: [+]process-running ok
Dec 01 21:36:07 crc kubenswrapper[4962]: healthz check failed
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.843247 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.852436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkgp7\" (UniqueName: \"kubernetes.io/projected/1569b19f-7f89-465d-9140-e2dce5e33425-kube-api-access-fkgp7\") pod \"community-operators-l5625\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.890915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-catalog-content\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.890963 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-utilities\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.891057 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xwq\" (UniqueName: \"kubernetes.io/projected/8308dc46-519c-4b6a-8e97-d073484e64ae-kube-api-access-82xwq\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.891087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.891432 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.391420321 +0000 UTC m=+152.492859516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.915404 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5625"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.934348 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gg284" podStartSLOduration=129.934332811 podStartE2EDuration="2m9.934332811s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.866116735 +0000 UTC m=+151.967555930" watchObservedRunningTime="2025-12-01 21:36:07.934332811 +0000 UTC m=+152.035772006"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.965716 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" podStartSLOduration=129.965699317 podStartE2EDuration="2m9.965699317s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:07.935168043 +0000 UTC m=+152.036607248" watchObservedRunningTime="2025-12-01 21:36:07.965699317 +0000 UTC m=+152.067138512"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.992284 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.992609 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xwq\" (UniqueName: \"kubernetes.io/projected/8308dc46-519c-4b6a-8e97-d073484e64ae-kube-api-access-82xwq\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.992658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-catalog-content\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.992679 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-utilities\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: E1201 21:36:07.993096 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.493075138 +0000 UTC m=+152.594514333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.993159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-utilities\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:07 crc kubenswrapper[4962]: I1201 21:36:07.993362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-catalog-content\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.029801 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mpxbb"]
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.046728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xwq\" (UniqueName: \"kubernetes.io/projected/8308dc46-519c-4b6a-8e97-d073484e64ae-kube-api-access-82xwq\") pod \"certified-operators-rlqt5\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.094008 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-85jfd" podStartSLOduration=129.093976064 podStartE2EDuration="2m9.093976064s" podCreationTimestamp="2025-12-01 21:33:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.092703691 +0000 UTC m=+152.194142896" watchObservedRunningTime="2025-12-01 21:36:08.093976064 +0000 UTC m=+152.195415259"
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.095069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.095350 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.59533549 +0000 UTC m=+152.696774685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.095572 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-j5fqx" podStartSLOduration=130.095566016 podStartE2EDuration="2m10.095566016s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.04557621 +0000 UTC m=+152.147015405" watchObservedRunningTime="2025-12-01 21:36:08.095566016 +0000 UTC m=+152.197005211"
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.105610 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.196015 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.196355 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.69634101 +0000 UTC m=+152.797780205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.297879 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2"
Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.298172 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.798160091 +0000 UTC m=+152.899599286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.387527 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" event={"ID":"ccb13006-cd7f-4249-9d04-7391d26eaae3","Type":"ContainerStarted","Data":"ac500cd821194ce840806fbe671a8796d175bda07df9d77ddd3e1e74de84e702"}
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.387824 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" event={"ID":"ccb13006-cd7f-4249-9d04-7391d26eaae3","Type":"ContainerStarted","Data":"8f36b054e0ba8cb889e9cf957fc7e81c099079d2cf761a35ed2218529e21b47c"}
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.388496 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b"
Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.400292 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.400632 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:08.900620048 +0000 UTC m=+153.002059243 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.402270 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6knc"] Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.436329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qj8zv" event={"ID":"12136e64-010e-49bc-9c3e-d1c65467f361","Type":"ContainerStarted","Data":"49e57f27f0f7c49f1d78300757baee3af298442f2e3c994b02ce91926954fe08"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.502855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.504830 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.004817672 +0000 UTC m=+153.106256867 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.525593 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7qvr5" event={"ID":"81201e01-7100-49fe-a7d2-d402fa8fe9af","Type":"ContainerStarted","Data":"98fbfdf914478d35d2c929091fa0805d7f1090e7a022e0a4ea9b82eb27c86ade"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.555530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7c71bc8a9b1d405a46f37cf36d1dc3e58c52184e4d23e08805105083384c131f"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.607000 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.607290 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.107266699 +0000 UTC m=+153.208705894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.607479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.607735 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.107723581 +0000 UTC m=+153.209162776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.608188 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n9nhh" event={"ID":"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247","Type":"ContainerStarted","Data":"23972d3273e9cf9b9f0bccc5f17c4217d075870889aa4d9a74e660c611aacaec"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.608219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n9nhh" event={"ID":"cfe98aa8-58d0-4ed5-ae64-8c0faa03e247","Type":"ContainerStarted","Data":"1f60a0b2f8e56e4f1dfcb0e102d1a1176415f814a8e0b532259612e3e0064816"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.608247 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.641055 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" podStartSLOduration=130.641039029 podStartE2EDuration="2m10.641039029s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.437165231 +0000 UTC m=+152.538604426" watchObservedRunningTime="2025-12-01 21:36:08.641039029 +0000 UTC m=+152.742478224" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.642946 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l5625"] Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.658036 4962 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n9nhh" podStartSLOduration=8.658005325 podStartE2EDuration="8.658005325s" podCreationTimestamp="2025-12-01 21:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.656967718 +0000 UTC m=+152.758406913" watchObservedRunningTime="2025-12-01 21:36:08.658005325 +0000 UTC m=+152.759444520" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.675542 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" event={"ID":"847a70f9-ec15-480b-8ed8-9d3cb006ce64","Type":"ContainerStarted","Data":"0e1bd982f23e00874ddb2e46a78ea1e9547ccd2b148b1ffb55b77af470916cbf"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.683003 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" event={"ID":"931bdb70-51c0-4893-b3b9-6fe8dd700233","Type":"ContainerStarted","Data":"cccc0ec4c44699b5039ff16b671c17ca5fae645dbe336975384eabaf11aba155"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.696403 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" event={"ID":"0b40c111-e562-48b7-9db2-1a494e16786c","Type":"ContainerStarted","Data":"d972df236db1bd9b354ef9afd0395068aebc2395811373b280fed1a96df90917"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.696436 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.697957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" event={"ID":"10c473f1-a2f6-4565-ba09-d7e28dea1600","Type":"ContainerStarted","Data":"4a1cd348a7eea3698e0526945513a3e6edae1084820f82c6695ab8ca0b5baf99"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.698907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" event={"ID":"f10f3763-03b0-43d0-88fd-ce89274a67d9","Type":"ContainerStarted","Data":"0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.699630 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-stv9m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.699657 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" podUID="f10f3763-03b0-43d0-88fd-ce89274a67d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.708711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:08 crc 
kubenswrapper[4962]: E1201 21:36:08.709542 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.209526012 +0000 UTC m=+153.310965207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.751106 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpxbb" event={"ID":"b9788313-1ab9-4dce-8fd0-363c8086d8d3","Type":"ContainerStarted","Data":"f93a2256f6a9d6a4b618d9b0c0ecbe732b8ff4cfc417a3ce09e523aad0ea87bf"} Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.759047 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" podStartSLOduration=130.759031955 podStartE2EDuration="2m10.759031955s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.756604001 +0000 UTC m=+152.858043196" watchObservedRunningTime="2025-12-01 21:36:08.759031955 +0000 UTC m=+152.860471150" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.761094 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gg284 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.761160 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gg284" podUID="710d5a74-c24a-452e-a5bf-1c23b3589361" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.761948 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.789477 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.793430 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6ffb7" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.818453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.830196 4962 patch_prober.go:28] 
interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:08 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:08 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:08 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.830251 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.831822 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.331807752 +0000 UTC m=+153.433246937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.863975 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" podStartSLOduration=130.863959338 podStartE2EDuration="2m10.863959338s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.86290075 +0000 UTC m=+152.964339965" watchObservedRunningTime="2025-12-01 21:36:08.863959338 +0000 UTC m=+152.965398533" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.917535 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlqt5"] Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.918670 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8cn" podStartSLOduration=130.918652288 podStartE2EDuration="2m10.918652288s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.9175985 +0000 UTC m=+153.019037695" watchObservedRunningTime="2025-12-01 21:36:08.918652288 +0000 UTC m=+153.020091483" Dec 01 21:36:08 crc kubenswrapper[4962]: I1201 21:36:08.924536 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:08 crc kubenswrapper[4962]: E1201 21:36:08.925115 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.425100008 +0000 UTC m=+153.526539193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.027676 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:09 crc kubenswrapper[4962]: E1201 21:36:09.027968 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.527955616 +0000 UTC m=+153.629394811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.049582 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" podStartSLOduration=131.049559835 podStartE2EDuration="2m11.049559835s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:08.998721956 +0000 UTC m=+153.100161151" watchObservedRunningTime="2025-12-01 21:36:09.049559835 +0000 UTC m=+153.150999030" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.128921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:09 crc kubenswrapper[4962]: E1201 21:36:09.129386 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.629367816 +0000 UTC m=+153.730807011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.230556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:09 crc kubenswrapper[4962]: E1201 21:36:09.230832 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.730807887 +0000 UTC m=+153.832247082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.235156 4962 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.327952 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j9h7f"] Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.329242 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.331443 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:09 crc kubenswrapper[4962]: E1201 21:36:09.331742 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.831725764 +0000 UTC m=+153.933164959 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.332005 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.355810 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9h7f"] Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.400395 4962 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T21:36:09.235180982Z","Handler":null,"Name":""} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.432834 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-utilities\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.432909 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-catalog-content\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.432948 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.432967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbdj6\" (UniqueName: \"kubernetes.io/projected/542eb764-e2cd-4043-9721-fe8f5d6d5d13-kube-api-access-fbdj6\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: E1201 21:36:09.433274 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 21:36:09.933263258 +0000 UTC m=+154.034702453 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k8fq2" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.444648 4962 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.444724 4962 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.533983 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.534127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-utilities\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.534202 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-catalog-content\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.534234 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbdj6\" (UniqueName: \"kubernetes.io/projected/542eb764-e2cd-4043-9721-fe8f5d6d5d13-kube-api-access-fbdj6\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.534802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-catalog-content\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.535217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-utilities\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.550415 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: 
"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.567648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbdj6\" (UniqueName: \"kubernetes.io/projected/542eb764-e2cd-4043-9721-fe8f5d6d5d13-kube-api-access-fbdj6\") pod \"redhat-marketplace-j9h7f\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.635879 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.651415 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.651454 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.677318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k8fq2\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.696627 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.745572 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-26z7n"] Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.746542 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.780911 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26z7n"] Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.799370 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.808364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" event={"ID":"be92f2e8-fa20-4d9b-8891-429dfc64490b","Type":"ContainerStarted","Data":"1808f842be61f5ff4667720f1fcca014c024529fb676abd387b7b5884d36d1d7"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.808401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" event={"ID":"be92f2e8-fa20-4d9b-8891-429dfc64490b","Type":"ContainerStarted","Data":"2ddd07099bd877b76b3efe9c1702d677ae07d97a2f7254cf64902df593a7732f"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.823125 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:09 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:09 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:09 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.823184 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.832436 4962 generic.go:334] "Generic (PLEG): container finished" podID="fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" containerID="578cf2c176fdcfafd49e6d657f8c064a6770b8a85590ca82bc4ea2b72aa4403d" exitCode=0 Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.832510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" event={"ID":"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc","Type":"ContainerDied","Data":"578cf2c176fdcfafd49e6d657f8c064a6770b8a85590ca82bc4ea2b72aa4403d"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.844319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-catalog-content\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.844372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-utilities\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.844411 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p79b\" (UniqueName: \"kubernetes.io/projected/970cf872-ad25-4424-85ca-4bf809e9f5f7-kube-api-access-9p79b\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.866822 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerID="36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891" exitCode=0 Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.867191 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpxbb" event={"ID":"b9788313-1ab9-4dce-8fd0-363c8086d8d3","Type":"ContainerDied","Data":"36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.868459 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerID="44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b" exitCode=0 Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.868508 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6knc" event={"ID":"a4620db4-9171-44b5-b944-dcf2e871ef41","Type":"ContainerDied","Data":"44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.868523 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6knc" event={"ID":"a4620db4-9171-44b5-b944-dcf2e871ef41","Type":"ContainerStarted","Data":"da9ad88de7b256ba8a05c12739f712eaa71d6b39c8d12f5acd23aabcf8074fd1"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.884998 4962 generic.go:334] "Generic (PLEG): container finished" podID="1569b19f-7f89-465d-9140-e2dce5e33425" containerID="8c985e4e459754e1a774e4f55c926ec2a0ddec41d8c5062afeb0045240bfaccf" exitCode=0 Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.885053 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5625" event={"ID":"1569b19f-7f89-465d-9140-e2dce5e33425","Type":"ContainerDied","Data":"8c985e4e459754e1a774e4f55c926ec2a0ddec41d8c5062afeb0045240bfaccf"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.885073 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5625" event={"ID":"1569b19f-7f89-465d-9140-e2dce5e33425","Type":"ContainerStarted","Data":"9e75162d6953c5e2a749399c2f5396127e6a6b12f31420394ceb4a11109d69b9"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.904711 4962 generic.go:334] "Generic (PLEG): container finished" podID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerID="0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624" exitCode=0 Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.904900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlqt5" event={"ID":"8308dc46-519c-4b6a-8e97-d073484e64ae","Type":"ContainerDied","Data":"0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.904955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlqt5" event={"ID":"8308dc46-519c-4b6a-8e97-d073484e64ae","Type":"ContainerStarted","Data":"df98c6461874c9b6bb55dcca6ba9ffbba0cd00180b2733c643171316078d16fd"} Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.924344 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.945474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-catalog-content\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.945624 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-utilities\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.945823 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p79b\" (UniqueName: \"kubernetes.io/projected/970cf872-ad25-4424-85ca-4bf809e9f5f7-kube-api-access-9p79b\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.948545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-utilities\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:09 crc kubenswrapper[4962]: I1201 21:36:09.949358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-catalog-content\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.018594 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p79b\" (UniqueName: \"kubernetes.io/projected/970cf872-ad25-4424-85ca-4bf809e9f5f7-kube-api-access-9p79b\") pod \"redhat-marketplace-26z7n\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.074618 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.241605 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.242730 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k8fq2"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.284320 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9h7f"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.357821 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zr6pm"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.361816 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.365027 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.376163 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zr6pm"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.456558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szx8\" (UniqueName: \"kubernetes.io/projected/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-kube-api-access-9szx8\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.456656 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-catalog-content\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.456690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-utilities\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.506556 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26z7n"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.558366 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szx8\" (UniqueName: \"kubernetes.io/projected/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-kube-api-access-9szx8\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.558506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-catalog-content\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.558539 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-utilities\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.559373 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-utilities\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.559901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-catalog-content\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.583734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szx8\" (UniqueName: \"kubernetes.io/projected/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-kube-api-access-9szx8\") pod \"redhat-operators-zr6pm\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.611130 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.612001 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.616457 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.625394 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.631250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.660420 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674a65c5-5310-4c99-8031-b486c9133db7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.660519 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/674a65c5-5310-4c99-8031-b486c9133db7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.723921 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.729917 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjnwd"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.730877 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.740941 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjnwd"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.768158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/674a65c5-5310-4c99-8031-b486c9133db7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.768217 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674a65c5-5310-4c99-8031-b486c9133db7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.768714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/674a65c5-5310-4c99-8031-b486c9133db7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.786178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674a65c5-5310-4c99-8031-b486c9133db7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.818583 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:10 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:10 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:10 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.818635 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.869584 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpbcs\" (UniqueName: \"kubernetes.io/projected/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-kube-api-access-tpbcs\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.869876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-catalog-content\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.869909 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-utilities\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.926725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.937381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" event={"ID":"be92f2e8-fa20-4d9b-8891-429dfc64490b","Type":"ContainerStarted","Data":"d665b7c59fac8fb5500621d01288e2f3f46c7467f0749fcc9141fc1929f22f9b"} Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.945905 4962 generic.go:334] "Generic (PLEG): container finished" podID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerID="d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63" exitCode=0 Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.945973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26z7n" event={"ID":"970cf872-ad25-4424-85ca-4bf809e9f5f7","Type":"ContainerDied","Data":"d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63"} Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.946032 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26z7n" event={"ID":"970cf872-ad25-4424-85ca-4bf809e9f5f7","Type":"ContainerStarted","Data":"745f6cc2a32463a2a4ee320990a2bf3b1b9cd26af8992d81cd64b76c6c8b5228"} Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.947152 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zr6pm"] Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.949006 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" event={"ID":"b2d93aa1-eee7-4a67-b5ee-a05a6696b624","Type":"ContainerStarted","Data":"e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4"} Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.949057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" event={"ID":"b2d93aa1-eee7-4a67-b5ee-a05a6696b624","Type":"ContainerStarted","Data":"f709c5f5721e83dc178563bba08680d253863887dc0d8ea1ea435658deeb5e0c"} Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.949137 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.955558 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tqsxt" podStartSLOduration=10.955542899 podStartE2EDuration="10.955542899s" podCreationTimestamp="2025-12-01 21:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:10.954867321 +0000 UTC m=+155.056306516" watchObservedRunningTime="2025-12-01 21:36:10.955542899 +0000 UTC m=+155.056982094" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.965424 4962 generic.go:334] "Generic (PLEG): container finished" podID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" 
containerID="3ef67691664d49c549408d0a1a537421dea2fe13f1bd4f059fe2524319683b75" exitCode=0 Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.966911 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9h7f" event={"ID":"542eb764-e2cd-4043-9721-fe8f5d6d5d13","Type":"ContainerDied","Data":"3ef67691664d49c549408d0a1a537421dea2fe13f1bd4f059fe2524319683b75"} Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.966959 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9h7f" event={"ID":"542eb764-e2cd-4043-9721-fe8f5d6d5d13","Type":"ContainerStarted","Data":"d50a73638c479e76d3beb5b452d38e6559599a99fb1fd24a4ad7488d7163dc23"} Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.986767 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpbcs\" (UniqueName: \"kubernetes.io/projected/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-kube-api-access-tpbcs\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.986855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-catalog-content\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.986900 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-utilities\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.987269 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-utilities\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:10 crc kubenswrapper[4962]: I1201 21:36:10.987758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-catalog-content\") pod \"redhat-operators-gjnwd\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.007769 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" podStartSLOduration=133.007753564 podStartE2EDuration="2m13.007753564s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:10.99776113 +0000 UTC m=+155.099200335" watchObservedRunningTime="2025-12-01 21:36:11.007753564 +0000 UTC m=+155.109192749" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.009150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpbcs\" (UniqueName: \"kubernetes.io/projected/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-kube-api-access-tpbcs\") pod \"redhat-operators-gjnwd\" (UID: 
\"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.077642 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.291716 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.393300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume\") pod \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.393356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lccq\" (UniqueName: \"kubernetes.io/projected/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-kube-api-access-8lccq\") pod \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.393434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-secret-volume\") pod \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\" (UID: \"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc\") " Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.394223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" (UID: "fde3e2c2-ed59-4cbf-8554-1a0438eb81dc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.399291 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-kube-api-access-8lccq" (OuterVolumeSpecName: "kube-api-access-8lccq") pod "fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" (UID: "fde3e2c2-ed59-4cbf-8554-1a0438eb81dc"). InnerVolumeSpecName "kube-api-access-8lccq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.399361 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" (UID: "fde3e2c2-ed59-4cbf-8554-1a0438eb81dc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.481407 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.495291 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.495323 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lccq\" (UniqueName: \"kubernetes.io/projected/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-kube-api-access-8lccq\") on node \"crc\" DevicePath \"\"" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.495334 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 21:36:11 crc kubenswrapper[4962]: W1201 21:36:11.496153 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod674a65c5_5310_4c99_8031_b486c9133db7.slice/crio-cab0d3624c489039b642fc55fd6b70c1083a88429ad25fa9a60715929772f6f6 WatchSource:0}: Error finding container cab0d3624c489039b642fc55fd6b70c1083a88429ad25fa9a60715929772f6f6: Status 404 returned error can't find the container with id cab0d3624c489039b642fc55fd6b70c1083a88429ad25fa9a60715929772f6f6 Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.573415 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjnwd"] Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.817538 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:11 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:11 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:11 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.817604 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.975348 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerID="9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0" exitCode=0 Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.975741 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr6pm" event={"ID":"2c522e07-a29f-4b5f-be82-e5eac46c1f6a","Type":"ContainerDied","Data":"9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0"} Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.975788 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr6pm" event={"ID":"2c522e07-a29f-4b5f-be82-e5eac46c1f6a","Type":"ContainerStarted","Data":"eef6c6cb8df65a7865ab6eceeab1f83d60670fce9eca546ec76ab07e08f7e289"} Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.983712 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" event={"ID":"fde3e2c2-ed59-4cbf-8554-1a0438eb81dc","Type":"ContainerDied","Data":"edf4629c34895a810e8726f6ac94a76f24dddf7936bc79e2a8faddea44a51dd1"} Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.983751 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edf4629c34895a810e8726f6ac94a76f24dddf7936bc79e2a8faddea44a51dd1" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.983805 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr" Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.987504 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnwd" event={"ID":"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a","Type":"ContainerStarted","Data":"85adddb405d44add697ffdd530f174033a3caa85b7920c3fc46541941c432513"} Dec 01 21:36:11 crc kubenswrapper[4962]: I1201 21:36:11.989727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"674a65c5-5310-4c99-8031-b486c9133db7","Type":"ContainerStarted","Data":"cab0d3624c489039b642fc55fd6b70c1083a88429ad25fa9a60715929772f6f6"} Dec 01 21:36:12 crc kubenswrapper[4962]: I1201 21:36:12.059903 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lt2df" Dec 01 21:36:12 crc kubenswrapper[4962]: I1201 21:36:12.819129 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:12 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:12 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:12 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:12 crc kubenswrapper[4962]: I1201 21:36:12.819488 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:12 crc kubenswrapper[4962]: I1201 21:36:12.838131 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:12 crc kubenswrapper[4962]: I1201 21:36:12.838187 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:12 crc kubenswrapper[4962]: I1201 21:36:12.839545 4962 patch_prober.go:28] interesting pod/console-f9d7485db-jsv9r container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Dec 01 21:36:12 crc kubenswrapper[4962]: I1201 21:36:12.839599 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jsv9r" podUID="4002c0b0-4b79-4755-a60c-2fc0cdac7876" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.013713 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerID="b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27" exitCode=0 Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.013779 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnwd" event={"ID":"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a","Type":"ContainerDied","Data":"b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27"} Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.039758 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"674a65c5-5310-4c99-8031-b486c9133db7","Type":"ContainerDied","Data":"35df5c6205993c2b6512084040914c450b6d8c72d196fa1bcb801bb1136fe76e"} Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.038916 4962 generic.go:334] "Generic (PLEG): container finished" podID="674a65c5-5310-4c99-8031-b486c9133db7" containerID="35df5c6205993c2b6512084040914c450b6d8c72d196fa1bcb801bb1136fe76e" exitCode=0 Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.054465 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.054542 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.060866 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.060914 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.065377 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.067371 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.381424 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 21:36:13 crc kubenswrapper[4962]: E1201 21:36:13.381632 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" containerName="collect-profiles" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.381643 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" containerName="collect-profiles" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.381742 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" containerName="collect-profiles" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.382140 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.384018 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.384984 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.392241 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.434724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d7583fd-7224-4abc-81c1-c592d1182ec0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.434838 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7583fd-7224-4abc-81c1-c592d1182ec0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.536720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7583fd-7224-4abc-81c1-c592d1182ec0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.536826 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d7583fd-7224-4abc-81c1-c592d1182ec0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.536967 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d7583fd-7224-4abc-81c1-c592d1182ec0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.557657 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7583fd-7224-4abc-81c1-c592d1182ec0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.744591 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.815194 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.818115 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:13 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:13 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:13 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.818198 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.822758 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gg284 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.822804 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gg284" podUID="710d5a74-c24a-452e-a5bf-1c23b3589361" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.823230 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-gg284 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 01 21:36:13 crc kubenswrapper[4962]: I1201 21:36:13.823255 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gg284" podUID="710d5a74-c24a-452e-a5bf-1c23b3589361" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 01 21:36:14 crc kubenswrapper[4962]: I1201 21:36:14.066207 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-w2wdh" Dec 01 21:36:14 crc kubenswrapper[4962]: I1201 21:36:14.066645 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-w56dm" Dec 01 21:36:14 crc kubenswrapper[4962]: I1201 21:36:14.816517 4962 patch_prober.go:28] interesting pod/router-default-5444994796-hjqsp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 21:36:14 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Dec 01 21:36:14 crc kubenswrapper[4962]: [+]process-running ok Dec 01 21:36:14 crc kubenswrapper[4962]: healthz check failed Dec 01 21:36:14 crc kubenswrapper[4962]: I1201 21:36:14.816812 4962 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-hjqsp" podUID="3bc8d3cc-b827-4f76-b41e-18e0790b6e66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 21:36:15 crc kubenswrapper[4962]: I1201 21:36:15.818573 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:15 crc kubenswrapper[4962]: I1201 21:36:15.822183 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hjqsp" Dec 01 21:36:16 crc kubenswrapper[4962]: I1201 21:36:16.239837 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n9nhh" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.075127 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.124803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674a65c5-5310-4c99-8031-b486c9133db7-kube-api-access\") pod \"674a65c5-5310-4c99-8031-b486c9133db7\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.124846 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/674a65c5-5310-4c99-8031-b486c9133db7-kubelet-dir\") pod \"674a65c5-5310-4c99-8031-b486c9133db7\" (UID: \"674a65c5-5310-4c99-8031-b486c9133db7\") " Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.125006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/674a65c5-5310-4c99-8031-b486c9133db7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "674a65c5-5310-4c99-8031-b486c9133db7" (UID: "674a65c5-5310-4c99-8031-b486c9133db7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.126781 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/674a65c5-5310-4c99-8031-b486c9133db7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.130621 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"674a65c5-5310-4c99-8031-b486c9133db7","Type":"ContainerDied","Data":"cab0d3624c489039b642fc55fd6b70c1083a88429ad25fa9a60715929772f6f6"} Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.130665 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cab0d3624c489039b642fc55fd6b70c1083a88429ad25fa9a60715929772f6f6" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.130722 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.136147 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674a65c5-5310-4c99-8031-b486c9133db7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "674a65c5-5310-4c99-8031-b486c9133db7" (UID: "674a65c5-5310-4c99-8031-b486c9133db7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.228289 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674a65c5-5310-4c99-8031-b486c9133db7-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.531857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.548895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1746bf-6971-44aa-ae52-f349e6963eb2-metrics-certs\") pod \"network-metrics-daemon-2q5q5\" (UID: \"5e1746bf-6971-44aa-ae52-f349e6963eb2\") " pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:36:21 crc kubenswrapper[4962]: I1201 21:36:21.644488 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2q5q5" Dec 01 21:36:22 crc kubenswrapper[4962]: I1201 21:36:22.843573 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:22 crc kubenswrapper[4962]: I1201 21:36:22.851387 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:36:23 crc kubenswrapper[4962]: I1201 21:36:23.838328 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gg284" Dec 01 21:36:29 crc kubenswrapper[4962]: I1201 21:36:29.807101 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:36:32 crc kubenswrapper[4962]: I1201 21:36:32.784623 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:36:32 crc kubenswrapper[4962]: I1201 21:36:32.785252 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:36:32 crc kubenswrapper[4962]: I1201 21:36:32.884156 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 21:36:37 crc kubenswrapper[4962]: E1201 21:36:37.587303 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 21:36:37 crc kubenswrapper[4962]: E1201 21:36:37.588314 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9p79b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-26z7n_openshift-marketplace(970cf872-ad25-4424-85ca-4bf809e9f5f7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 21:36:37 crc kubenswrapper[4962]: E1201 21:36:37.589527 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-26z7n" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" Dec 01 21:36:43 crc kubenswrapper[4962]: E1201 21:36:43.413038 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-26z7n" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" Dec 01 21:36:43 crc kubenswrapper[4962]: I1201 21:36:43.590069 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 21:36:43 crc kubenswrapper[4962]: I1201 21:36:43.822091 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7p82b" Dec 01 21:36:48 crc kubenswrapper[4962]: E1201 21:36:48.037618 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 21:36:48 crc kubenswrapper[4962]: E1201 21:36:48.037801 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82xwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rlqt5_openshift-marketplace(8308dc46-519c-4b6a-8e97-d073484e64ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 21:36:48 crc kubenswrapper[4962]: E1201 21:36:48.038983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rlqt5" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" Dec 01 21:36:50 crc kubenswrapper[4962]: E1201 21:36:50.777334 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 21:36:50 crc kubenswrapper[4962]: E1201 21:36:50.777773 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkgp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l5625_openshift-marketplace(1569b19f-7f89-465d-9140-e2dce5e33425): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 21:36:50 crc kubenswrapper[4962]: E1201 21:36:50.778985 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l5625" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" Dec 01 21:36:52 crc kubenswrapper[4962]: E1201 21:36:52.899987 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 21:36:52 crc kubenswrapper[4962]: E1201 21:36:52.900166 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qz8pv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x6knc_openshift-marketplace(a4620db4-9171-44b5-b944-dcf2e871ef41): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 21:36:52 crc kubenswrapper[4962]: E1201 21:36:52.901763 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x6knc" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.023468 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 21:36:53 crc kubenswrapper[4962]: E1201 21:36:53.023899 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674a65c5-5310-4c99-8031-b486c9133db7" containerName="pruner" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.023946 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="674a65c5-5310-4c99-8031-b486c9133db7" containerName="pruner" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.024063 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="674a65c5-5310-4c99-8031-b486c9133db7" containerName="pruner" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.024493 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.027764 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 21:36:53 crc kubenswrapper[4962]: E1201 21:36:53.124861 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 21:36:53 crc kubenswrapper[4962]: E1201 21:36:53.125027 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxfzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mpxbb_openshift-marketplace(b9788313-1ab9-4dce-8fd0-363c8086d8d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 21:36:53 crc kubenswrapper[4962]: E1201 21:36:53.126827 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mpxbb" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.198077 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e294655f-e01a-4944-a8da-3480d76481b8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.198121 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e294655f-e01a-4944-a8da-3480d76481b8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.299428 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e294655f-e01a-4944-a8da-3480d76481b8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.299482 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e294655f-e01a-4944-a8da-3480d76481b8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.299668 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e294655f-e01a-4944-a8da-3480d76481b8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.338806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e294655f-e01a-4944-a8da-3480d76481b8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:53 crc kubenswrapper[4962]: I1201 21:36:53.354320 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:36:57 crc kubenswrapper[4962]: E1201 21:36:57.466388 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 21:36:57 crc kubenswrapper[4962]: E1201 21:36:57.467181 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9szx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zr6pm_openshift-marketplace(2c522e07-a29f-4b5f-be82-e5eac46c1f6a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 21:36:57 crc kubenswrapper[4962]: E1201 21:36:57.468466 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zr6pm" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" Dec 01 21:36:57 crc kubenswrapper[4962]: I1201 21:36:57.661179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d7583fd-7224-4abc-81c1-c592d1182ec0","Type":"ContainerStarted","Data":"6eb539226ccbfe8f9f6e52594c35952c386a0b2896a0255b30c062050e682936"} Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.243308 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x6knc" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.243714 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mpxbb" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.243763 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l5625" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.243836 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zr6pm" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.322066 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.322559 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpbcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gjnwd_openshift-marketplace(d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.327252 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled\"" pod="openshift-marketplace/redhat-operators-gjnwd" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.537474 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 21:36:58 crc kubenswrapper[4962]: W1201 21:36:58.548056 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode294655f_e01a_4944_a8da_3480d76481b8.slice/crio-04eb058a9f3fb47177a6feed32df1f33fae13a186017ff48ad709c543eb058f5 WatchSource:0}: Error finding container 04eb058a9f3fb47177a6feed32df1f33fae13a186017ff48ad709c543eb058f5: Status 404 returned error can't find the container with id 04eb058a9f3fb47177a6feed32df1f33fae13a186017ff48ad709c543eb058f5 Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.625562 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.627104 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.628969 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.669519 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d7583fd-7224-4abc-81c1-c592d1182ec0","Type":"ContainerStarted","Data":"6e20069819285e38322b60b502d4a1795ceb6823ee1a9e5ca5287f40fe9ee905"} Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.672414 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e294655f-e01a-4944-a8da-3480d76481b8","Type":"ContainerStarted","Data":"04eb058a9f3fb47177a6feed32df1f33fae13a186017ff48ad709c543eb058f5"} Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.673464 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2q5q5"] Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.677477 4962 generic.go:334] "Generic (PLEG): container finished" podID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerID="de86d7faffa7c5acfddf3aeda3fbcef7c401908f6f019cb2e934fb45a32e82bc" exitCode=0 Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.677570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9h7f" event={"ID":"542eb764-e2cd-4043-9721-fe8f5d6d5d13","Type":"ContainerDied","Data":"de86d7faffa7c5acfddf3aeda3fbcef7c401908f6f019cb2e934fb45a32e82bc"} Dec 01 21:36:58 crc kubenswrapper[4962]: E1201 21:36:58.688889 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gjnwd" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.701956 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=45.701917999 podStartE2EDuration="45.701917999s" podCreationTimestamp="2025-12-01 21:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:58.682565386 +0000 
UTC m=+202.784004591" watchObservedRunningTime="2025-12-01 21:36:58.701917999 +0000 UTC m=+202.803357204" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.786439 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-var-lock\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.786508 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.786607 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b178e206-e661-4a20-af64-1b538fbc947a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.888475 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-var-lock\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.888616 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.888570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-var-lock\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.888718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b178e206-e661-4a20-af64-1b538fbc947a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.888776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 21:36:58.910534 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b178e206-e661-4a20-af64-1b538fbc947a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:58 crc kubenswrapper[4962]: I1201 
21:36:58.972139 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.181887 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.684533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9h7f" event={"ID":"542eb764-e2cd-4043-9721-fe8f5d6d5d13","Type":"ContainerStarted","Data":"af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.686111 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b178e206-e661-4a20-af64-1b538fbc947a","Type":"ContainerStarted","Data":"50676283b1f5f2c528010bf413d0d8627e9afde30700c96062d318ba343f0cbe"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.686160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b178e206-e661-4a20-af64-1b538fbc947a","Type":"ContainerStarted","Data":"3efe046b06727fb9057fe9ce6463ca986605bab026b349a6fa1ef3051cc47b30"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.688607 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" event={"ID":"5e1746bf-6971-44aa-ae52-f349e6963eb2","Type":"ContainerStarted","Data":"d64d02997084fe9346922443a669f2fb66bee199d84b536b2a7dfe6e81dc0aa4"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.688649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" event={"ID":"5e1746bf-6971-44aa-ae52-f349e6963eb2","Type":"ContainerStarted","Data":"28ff7609ac56e8308e67a32ca7b4a23e93d5e3cad1d224be1617b719507c0499"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.688663 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2q5q5" event={"ID":"5e1746bf-6971-44aa-ae52-f349e6963eb2","Type":"ContainerStarted","Data":"1bee6b89147c4ce496a0fd7dc54df40372789276584c6ccacbc07cda4cee222b"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.690444 4962 generic.go:334] "Generic (PLEG): container finished" podID="4d7583fd-7224-4abc-81c1-c592d1182ec0" containerID="6e20069819285e38322b60b502d4a1795ceb6823ee1a9e5ca5287f40fe9ee905" exitCode=0 Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.690502 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d7583fd-7224-4abc-81c1-c592d1182ec0","Type":"ContainerDied","Data":"6e20069819285e38322b60b502d4a1795ceb6823ee1a9e5ca5287f40fe9ee905"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.693047 4962 generic.go:334] "Generic (PLEG): container finished" podID="e294655f-e01a-4944-a8da-3480d76481b8" containerID="33f09c6554b6f28183d997be4655c24df09651e78d1cb53f055e76edf63a1d93" exitCode=0 Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.693105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e294655f-e01a-4944-a8da-3480d76481b8","Type":"ContainerDied","Data":"33f09c6554b6f28183d997be4655c24df09651e78d1cb53f055e76edf63a1d93"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.696346 4962 generic.go:334] "Generic (PLEG): container finished" podID="970cf872-ad25-4424-85ca-4bf809e9f5f7" 
containerID="804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab" exitCode=0 Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.696386 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26z7n" event={"ID":"970cf872-ad25-4424-85ca-4bf809e9f5f7","Type":"ContainerDied","Data":"804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.697361 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.697387 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.701091 4962 generic.go:334] "Generic (PLEG): container finished" podID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerID="59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4" exitCode=0 Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.701142 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlqt5" event={"ID":"8308dc46-519c-4b6a-8e97-d073484e64ae","Type":"ContainerDied","Data":"59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4"} Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.702918 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j9h7f" podStartSLOduration=2.512338089 podStartE2EDuration="50.702907976s" podCreationTimestamp="2025-12-01 21:36:09 +0000 UTC" firstStartedPulling="2025-12-01 21:36:10.978422221 +0000 UTC m=+155.079861406" lastFinishedPulling="2025-12-01 21:36:59.168992098 +0000 UTC m=+203.270431293" observedRunningTime="2025-12-01 21:36:59.701098967 +0000 UTC m=+203.802538182" watchObservedRunningTime="2025-12-01 21:36:59.702907976 +0000 UTC m=+203.804347171" Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.719080 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2q5q5" podStartSLOduration=181.719061653 podStartE2EDuration="3m1.719061653s" podCreationTimestamp="2025-12-01 21:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:59.716928155 +0000 UTC m=+203.818367350" watchObservedRunningTime="2025-12-01 21:36:59.719061653 +0000 UTC m=+203.820500858" Dec 01 21:36:59 crc kubenswrapper[4962]: I1201 21:36:59.750995 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.750978027 podStartE2EDuration="1.750978027s" podCreationTimestamp="2025-12-01 21:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:36:59.750450692 +0000 UTC m=+203.851889887" watchObservedRunningTime="2025-12-01 21:36:59.750978027 +0000 UTC m=+203.852417222" Dec 01 21:37:00 crc kubenswrapper[4962]: I1201 21:37:00.711192 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlqt5" event={"ID":"8308dc46-519c-4b6a-8e97-d073484e64ae","Type":"ContainerStarted","Data":"eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b"} Dec 01 21:37:00 crc kubenswrapper[4962]: I1201 21:37:00.713671 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26z7n" event={"ID":"970cf872-ad25-4424-85ca-4bf809e9f5f7","Type":"ContainerStarted","Data":"bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca"} Dec 01 21:37:00 crc kubenswrapper[4962]: I1201 21:37:00.729390 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rlqt5" podStartSLOduration=3.507573867 podStartE2EDuration="53.72935425s" podCreationTimestamp="2025-12-01 21:36:07 +0000 UTC" firstStartedPulling="2025-12-01 21:36:09.915441174 +0000 UTC m=+154.016880379" lastFinishedPulling="2025-12-01 21:37:00.137221557 +0000 UTC m=+204.238660762" observedRunningTime="2025-12-01 21:37:00.726676858 +0000 UTC m=+204.828116063" watchObservedRunningTime="2025-12-01 21:37:00.72935425 +0000 UTC m=+204.830793485" Dec 01 21:37:00 crc kubenswrapper[4962]: I1201 21:37:00.749511 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-26z7n" podStartSLOduration=2.4672520479999998 podStartE2EDuration="51.749486375s" podCreationTimestamp="2025-12-01 21:36:09 +0000 UTC" firstStartedPulling="2025-12-01 21:36:10.965817489 +0000 UTC m=+155.067256684" lastFinishedPulling="2025-12-01 21:37:00.248051806 +0000 UTC m=+204.349491011" observedRunningTime="2025-12-01 21:37:00.745545649 +0000 UTC m=+204.846984884" watchObservedRunningTime="2025-12-01 21:37:00.749486375 +0000 UTC m=+204.850925570" Dec 01 21:37:00 crc kubenswrapper[4962]: I1201 21:37:00.773526 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-j9h7f" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="registry-server" probeResult="failure" output=< Dec 01 21:37:00 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 21:37:00 crc kubenswrapper[4962]: > Dec 01 21:37:00 crc kubenswrapper[4962]: I1201 21:37:00.944603 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:37:00 crc kubenswrapper[4962]: I1201 21:37:00.994310 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.020919 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7583fd-7224-4abc-81c1-c592d1182ec0-kube-api-access\") pod \"4d7583fd-7224-4abc-81c1-c592d1182ec0\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.021013 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d7583fd-7224-4abc-81c1-c592d1182ec0-kubelet-dir\") pod \"4d7583fd-7224-4abc-81c1-c592d1182ec0\" (UID: \"4d7583fd-7224-4abc-81c1-c592d1182ec0\") " Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.021081 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e294655f-e01a-4944-a8da-3480d76481b8-kube-api-access\") pod \"e294655f-e01a-4944-a8da-3480d76481b8\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.021145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e294655f-e01a-4944-a8da-3480d76481b8-kubelet-dir\") pod \"e294655f-e01a-4944-a8da-3480d76481b8\" (UID: \"e294655f-e01a-4944-a8da-3480d76481b8\") " Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.021371 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e294655f-e01a-4944-a8da-3480d76481b8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e294655f-e01a-4944-a8da-3480d76481b8" (UID: "e294655f-e01a-4944-a8da-3480d76481b8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.022287 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d7583fd-7224-4abc-81c1-c592d1182ec0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4d7583fd-7224-4abc-81c1-c592d1182ec0" (UID: "4d7583fd-7224-4abc-81c1-c592d1182ec0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.027208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d7583fd-7224-4abc-81c1-c592d1182ec0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4d7583fd-7224-4abc-81c1-c592d1182ec0" (UID: "4d7583fd-7224-4abc-81c1-c592d1182ec0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.027455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e294655f-e01a-4944-a8da-3480d76481b8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e294655f-e01a-4944-a8da-3480d76481b8" (UID: "e294655f-e01a-4944-a8da-3480d76481b8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.122509 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e294655f-e01a-4944-a8da-3480d76481b8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.122535 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d7583fd-7224-4abc-81c1-c592d1182ec0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.122546 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d7583fd-7224-4abc-81c1-c592d1182ec0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.122554 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e294655f-e01a-4944-a8da-3480d76481b8-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.720029 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d7583fd-7224-4abc-81c1-c592d1182ec0","Type":"ContainerDied","Data":"6eb539226ccbfe8f9f6e52594c35952c386a0b2896a0255b30c062050e682936"} Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.720572 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb539226ccbfe8f9f6e52594c35952c386a0b2896a0255b30c062050e682936" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.720671 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.723638 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.723831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e294655f-e01a-4944-a8da-3480d76481b8","Type":"ContainerDied","Data":"04eb058a9f3fb47177a6feed32df1f33fae13a186017ff48ad709c543eb058f5"} Dec 01 21:37:01 crc kubenswrapper[4962]: I1201 21:37:01.723887 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04eb058a9f3fb47177a6feed32df1f33fae13a186017ff48ad709c543eb058f5" Dec 01 21:37:02 crc kubenswrapper[4962]: I1201 21:37:02.785234 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:37:02 crc kubenswrapper[4962]: I1201 21:37:02.785756 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:37:02 crc kubenswrapper[4962]: I1201 21:37:02.785815 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:37:02 crc kubenswrapper[4962]: I1201 21:37:02.786626 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 21:37:02 crc kubenswrapper[4962]: I1201 21:37:02.786744 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6" gracePeriod=600 Dec 01 21:37:03 crc kubenswrapper[4962]: I1201 21:37:03.742316 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6" exitCode=0 Dec 01 21:37:03 crc kubenswrapper[4962]: I1201 21:37:03.742415 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6"} Dec 01 21:37:03 crc kubenswrapper[4962]: I1201 21:37:03.744076 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"749bd494341ecd94507a174dd68318952a7c94f26fd3fad275718b333cbd13e5"} Dec 01 21:37:08 crc kubenswrapper[4962]: I1201 21:37:08.106371 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rlqt5" Dec 01 21:37:08 crc kubenswrapper[4962]: 
Dec 01 21:37:08 crc kubenswrapper[4962]: I1201 21:37:08.106679 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:37:08 crc kubenswrapper[4962]: I1201 21:37:08.154504 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:37:08 crc kubenswrapper[4962]: I1201 21:37:08.849324 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rlqt5"
Dec 01 21:37:09 crc kubenswrapper[4962]: I1201 21:37:09.752474 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j9h7f"
Dec 01 21:37:09 crc kubenswrapper[4962]: I1201 21:37:09.805015 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j9h7f"
Dec 01 21:37:10 crc kubenswrapper[4962]: I1201 21:37:10.079300 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-26z7n"
Dec 01 21:37:10 crc kubenswrapper[4962]: I1201 21:37:10.079378 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-26z7n"
Dec 01 21:37:10 crc kubenswrapper[4962]: I1201 21:37:10.119266 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-26z7n"
Dec 01 21:37:10 crc kubenswrapper[4962]: I1201 21:37:10.173092 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlqt5"]
Dec 01 21:37:10 crc kubenswrapper[4962]: I1201 21:37:10.795817 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rlqt5" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="registry-server" containerID="cri-o://eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b" gracePeriod=2
Dec 01 21:37:10 crc kubenswrapper[4962]: I1201 21:37:10.848722 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-26z7n"
Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.168494 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rlqt5" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.261010 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82xwq\" (UniqueName: \"kubernetes.io/projected/8308dc46-519c-4b6a-8e97-d073484e64ae-kube-api-access-82xwq\") pod \"8308dc46-519c-4b6a-8e97-d073484e64ae\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.261131 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-catalog-content\") pod \"8308dc46-519c-4b6a-8e97-d073484e64ae\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.261161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-utilities\") pod \"8308dc46-519c-4b6a-8e97-d073484e64ae\" (UID: \"8308dc46-519c-4b6a-8e97-d073484e64ae\") " Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.265076 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-utilities" (OuterVolumeSpecName: "utilities") pod "8308dc46-519c-4b6a-8e97-d073484e64ae" (UID: "8308dc46-519c-4b6a-8e97-d073484e64ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.269068 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8308dc46-519c-4b6a-8e97-d073484e64ae-kube-api-access-82xwq" (OuterVolumeSpecName: "kube-api-access-82xwq") pod "8308dc46-519c-4b6a-8e97-d073484e64ae" (UID: "8308dc46-519c-4b6a-8e97-d073484e64ae"). InnerVolumeSpecName "kube-api-access-82xwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.318669 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8308dc46-519c-4b6a-8e97-d073484e64ae" (UID: "8308dc46-519c-4b6a-8e97-d073484e64ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.363712 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82xwq\" (UniqueName: \"kubernetes.io/projected/8308dc46-519c-4b6a-8e97-d073484e64ae-kube-api-access-82xwq\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.363744 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.363755 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8308dc46-519c-4b6a-8e97-d073484e64ae-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.803133 4962 generic.go:334] "Generic (PLEG): container finished" podID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerID="eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b" exitCode=0 Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.803209 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlqt5" event={"ID":"8308dc46-519c-4b6a-8e97-d073484e64ae","Type":"ContainerDied","Data":"eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b"} Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.803226 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlqt5" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.803252 4962 scope.go:117] "RemoveContainer" containerID="eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.803241 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlqt5" event={"ID":"8308dc46-519c-4b6a-8e97-d073484e64ae","Type":"ContainerDied","Data":"df98c6461874c9b6bb55dcca6ba9ffbba0cd00180b2733c643171316078d16fd"} Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.805856 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerID="ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137" exitCode=0 Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.806898 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr6pm" event={"ID":"2c522e07-a29f-4b5f-be82-e5eac46c1f6a","Type":"ContainerDied","Data":"ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137"} Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.831793 4962 scope.go:117] "RemoveContainer" containerID="59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.859143 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlqt5"] Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.872906 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rlqt5"] Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.874787 4962 scope.go:117] "RemoveContainer" containerID="0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.915061 4962 scope.go:117] "RemoveContainer" 
containerID="eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b" Dec 01 21:37:11 crc kubenswrapper[4962]: E1201 21:37:11.915563 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b\": container with ID starting with eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b not found: ID does not exist" containerID="eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.915617 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b"} err="failed to get container status \"eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b\": rpc error: code = NotFound desc = could not find container \"eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b\": container with ID starting with eadca5d0a4e1433b3cfe8ecbe44ebe22df8d5ad8ff02f87e32ddae2990c43d5b not found: ID does not exist" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.915653 4962 scope.go:117] "RemoveContainer" containerID="59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4" Dec 01 21:37:11 crc kubenswrapper[4962]: E1201 21:37:11.916133 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4\": container with ID starting with 59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4 not found: ID does not exist" containerID="59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.916188 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4"} err="failed to get container status \"59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4\": rpc error: code = NotFound desc = could not find container \"59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4\": container with ID starting with 59bd59c13ff292c1a086d9e4d4afa01eecdb9b399d787432173464f1b28f2cc4 not found: ID does not exist" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.916228 4962 scope.go:117] "RemoveContainer" containerID="0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624" Dec 01 21:37:11 crc kubenswrapper[4962]: E1201 21:37:11.916493 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624\": container with ID starting with 0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624 not found: ID does not exist" containerID="0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624" Dec 01 21:37:11 crc kubenswrapper[4962]: I1201 21:37:11.916515 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624"} err="failed to get container status \"0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624\": rpc error: code = NotFound desc = could not find container \"0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624\": container with ID starting with 
0fba10999ad935b5ef4527037f5411522498d37cce599b24c8898085fd7d4624 not found: ID does not exist"
Dec 01 21:37:12 crc kubenswrapper[4962]: I1201 21:37:12.229482 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" path="/var/lib/kubelet/pods/8308dc46-519c-4b6a-8e97-d073484e64ae/volumes"
Dec 01 21:37:12 crc kubenswrapper[4962]: I1201 21:37:12.572992 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26z7n"]
Dec 01 21:37:12 crc kubenswrapper[4962]: I1201 21:37:12.811513 4962 generic.go:334] "Generic (PLEG): container finished" podID="1569b19f-7f89-465d-9140-e2dce5e33425" containerID="1f3f289ac151392bcc47972bea17d814a142d11beefcafcf42c03d0a8bd7f2e8" exitCode=0
Dec 01 21:37:12 crc kubenswrapper[4962]: I1201 21:37:12.811563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5625" event={"ID":"1569b19f-7f89-465d-9140-e2dce5e33425","Type":"ContainerDied","Data":"1f3f289ac151392bcc47972bea17d814a142d11beefcafcf42c03d0a8bd7f2e8"}
Dec 01 21:37:12 crc kubenswrapper[4962]: I1201 21:37:12.816985 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-26z7n" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="registry-server" containerID="cri-o://bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca" gracePeriod=2
Dec 01 21:37:12 crc kubenswrapper[4962]: I1201 21:37:12.817044 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr6pm" event={"ID":"2c522e07-a29f-4b5f-be82-e5eac46c1f6a","Type":"ContainerStarted","Data":"86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93"}
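The ContainerStatus NotFound errors a few entries back are the benign tail of container removal: the kubelet's RemoveContainer bookkeeping re-queries a container CRI-O has already deleted, so the runtime answers with gRPC NotFound and the kubelet just logs it and moves on. Cleanup code distinguishes that case from a real failure by the status code; a sketch of the pattern (hypothetical helper, not kubelet source):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether a CRI error only means the container no longer
// exists, which removal paths should treat as success rather than retrying.
func alreadyGone(err error) bool {
	s, ok := status.FromError(err)
	return ok && s.Code() == codes.NotFound
}

func main() {
	// Stand-in for the runtime's reply once the container has been deleted
	// (the real message quotes the full 64-character container ID).
	err := status.Error(codes.NotFound, `could not find container "eadca5d0...": ID does not exist`)
	fmt.Println(alreadyGone(err)) // true: safe to drop from bookkeeping
}
```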
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.653972 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zr6pm" podStartSLOduration=3.324449476 podStartE2EDuration="1m3.653954195s" podCreationTimestamp="2025-12-01 21:36:10 +0000 UTC" firstStartedPulling="2025-12-01 21:36:11.977332963 +0000 UTC m=+156.078772158" lastFinishedPulling="2025-12-01 21:37:12.306837682 +0000 UTC m=+216.408276877" observedRunningTime="2025-12-01 21:37:12.871105441 +0000 UTC m=+216.972544636" watchObservedRunningTime="2025-12-01 21:37:13.653954195 +0000 UTC m=+217.755393390" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.705918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-catalog-content\") pod \"970cf872-ad25-4424-85ca-4bf809e9f5f7\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.706006 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p79b\" (UniqueName: \"kubernetes.io/projected/970cf872-ad25-4424-85ca-4bf809e9f5f7-kube-api-access-9p79b\") pod \"970cf872-ad25-4424-85ca-4bf809e9f5f7\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.706039 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-utilities\") pod \"970cf872-ad25-4424-85ca-4bf809e9f5f7\" (UID: \"970cf872-ad25-4424-85ca-4bf809e9f5f7\") " Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.706925 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-utilities" (OuterVolumeSpecName: "utilities") pod "970cf872-ad25-4424-85ca-4bf809e9f5f7" (UID: "970cf872-ad25-4424-85ca-4bf809e9f5f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.712244 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/970cf872-ad25-4424-85ca-4bf809e9f5f7-kube-api-access-9p79b" (OuterVolumeSpecName: "kube-api-access-9p79b") pod "970cf872-ad25-4424-85ca-4bf809e9f5f7" (UID: "970cf872-ad25-4424-85ca-4bf809e9f5f7"). InnerVolumeSpecName "kube-api-access-9p79b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.726284 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "970cf872-ad25-4424-85ca-4bf809e9f5f7" (UID: "970cf872-ad25-4424-85ca-4bf809e9f5f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.807858 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.807896 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p79b\" (UniqueName: \"kubernetes.io/projected/970cf872-ad25-4424-85ca-4bf809e9f5f7-kube-api-access-9p79b\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.807908 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/970cf872-ad25-4424-85ca-4bf809e9f5f7-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.827719 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerID="92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04" exitCode=0 Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.827815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6knc" event={"ID":"a4620db4-9171-44b5-b944-dcf2e871ef41","Type":"ContainerDied","Data":"92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04"} Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.832500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5625" event={"ID":"1569b19f-7f89-465d-9140-e2dce5e33425","Type":"ContainerStarted","Data":"f4d8761ba204257d12b36566746dbab02fa414a21a1d543568459253a0445e9d"} Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.835246 4962 generic.go:334] "Generic (PLEG): container finished" podID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerID="bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca" exitCode=0 Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.835300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26z7n" event={"ID":"970cf872-ad25-4424-85ca-4bf809e9f5f7","Type":"ContainerDied","Data":"bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca"} Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.835325 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26z7n" event={"ID":"970cf872-ad25-4424-85ca-4bf809e9f5f7","Type":"ContainerDied","Data":"745f6cc2a32463a2a4ee320990a2bf3b1b9cd26af8992d81cd64b76c6c8b5228"} Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.835354 4962 scope.go:117] "RemoveContainer" containerID="bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.835451 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26z7n" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.843149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnwd" event={"ID":"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a","Type":"ContainerStarted","Data":"c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64"} Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.855267 4962 generic.go:334] "Generic (PLEG): container finished" podID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerID="ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce" exitCode=0 Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.855337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpxbb" event={"ID":"b9788313-1ab9-4dce-8fd0-363c8086d8d3","Type":"ContainerDied","Data":"ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce"} Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.859694 4962 scope.go:117] "RemoveContainer" containerID="804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.892789 4962 scope.go:117] "RemoveContainer" containerID="d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.901396 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l5625" podStartSLOduration=3.355832677 podStartE2EDuration="1m6.90137674s" podCreationTimestamp="2025-12-01 21:36:07 +0000 UTC" firstStartedPulling="2025-12-01 21:36:09.892751116 +0000 UTC m=+153.994190311" lastFinishedPulling="2025-12-01 21:37:13.438295179 +0000 UTC m=+217.539734374" observedRunningTime="2025-12-01 21:37:13.896979781 +0000 UTC m=+217.998418986" watchObservedRunningTime="2025-12-01 21:37:13.90137674 +0000 UTC m=+218.002815935" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.932370 4962 scope.go:117] "RemoveContainer" containerID="bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca" Dec 01 21:37:13 crc kubenswrapper[4962]: E1201 21:37:13.937330 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca\": container with ID starting with bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca not found: ID does not exist" containerID="bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.937410 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca"} err="failed to get container status \"bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca\": rpc error: code = NotFound desc = could not find container \"bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca\": container with ID starting with bb80e357db8323314e01bdf450386af80beb265ef5049df73c7a70a2f33574ca not found: ID does not exist" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.937465 4962 scope.go:117] "RemoveContainer" containerID="804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab" Dec 01 21:37:13 crc kubenswrapper[4962]: E1201 21:37:13.938350 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab\": container with ID starting with 804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab not found: ID does not exist" containerID="804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.938462 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab"} err="failed to get container status \"804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab\": rpc error: code = NotFound desc = could not find container \"804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab\": container with ID starting with 804d92c8f92f115dd66314701eb5ce998ee2f92ec92de56f65f440301a574cab not found: ID does not exist" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.938514 4962 scope.go:117] "RemoveContainer" containerID="d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63" Dec 01 21:37:13 crc kubenswrapper[4962]: E1201 21:37:13.940151 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63\": container with ID starting with d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63 not found: ID does not exist" containerID="d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.940324 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63"} err="failed to get container status \"d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63\": rpc error: code = NotFound desc = could not find container \"d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63\": container with ID starting with d79584d260f2bb68e092dba2ac003c2379b83ce840e5a2af21b039b9bc5c1d63 not found: ID does not exist" Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.958548 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26z7n"] Dec 01 21:37:13 crc kubenswrapper[4962]: I1201 21:37:13.961617 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-26z7n"] Dec 01 21:37:14 crc kubenswrapper[4962]: I1201 21:37:14.226178 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" path="/var/lib/kubelet/pods/970cf872-ad25-4424-85ca-4bf809e9f5f7/volumes" Dec 01 21:37:14 crc kubenswrapper[4962]: I1201 21:37:14.862704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpxbb" event={"ID":"b9788313-1ab9-4dce-8fd0-363c8086d8d3","Type":"ContainerStarted","Data":"470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c"} Dec 01 21:37:14 crc kubenswrapper[4962]: I1201 21:37:14.866077 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6knc" event={"ID":"a4620db4-9171-44b5-b944-dcf2e871ef41","Type":"ContainerStarted","Data":"47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0"} Dec 01 21:37:14 crc kubenswrapper[4962]: I1201 21:37:14.869138 4962 generic.go:334] "Generic (PLEG): container finished" podID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" 
containerID="c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64" exitCode=0 Dec 01 21:37:14 crc kubenswrapper[4962]: I1201 21:37:14.869187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnwd" event={"ID":"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a","Type":"ContainerDied","Data":"c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64"} Dec 01 21:37:14 crc kubenswrapper[4962]: I1201 21:37:14.893689 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mpxbb" podStartSLOduration=2.233730889 podStartE2EDuration="1m7.89366334s" podCreationTimestamp="2025-12-01 21:36:07 +0000 UTC" firstStartedPulling="2025-12-01 21:36:08.761696016 +0000 UTC m=+152.863135211" lastFinishedPulling="2025-12-01 21:37:14.421628467 +0000 UTC m=+218.523067662" observedRunningTime="2025-12-01 21:37:14.890522015 +0000 UTC m=+218.991961210" watchObservedRunningTime="2025-12-01 21:37:14.89366334 +0000 UTC m=+218.995102535" Dec 01 21:37:14 crc kubenswrapper[4962]: I1201 21:37:14.910622 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6knc" podStartSLOduration=3.499689397 podStartE2EDuration="1m7.910601909s" podCreationTimestamp="2025-12-01 21:36:07 +0000 UTC" firstStartedPulling="2025-12-01 21:36:09.881170521 +0000 UTC m=+153.982609716" lastFinishedPulling="2025-12-01 21:37:14.292083033 +0000 UTC m=+218.393522228" observedRunningTime="2025-12-01 21:37:14.907295159 +0000 UTC m=+219.008734354" watchObservedRunningTime="2025-12-01 21:37:14.910601909 +0000 UTC m=+219.012041104" Dec 01 21:37:15 crc kubenswrapper[4962]: I1201 21:37:15.881005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnwd" event={"ID":"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a","Type":"ContainerStarted","Data":"29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72"} Dec 01 21:37:15 crc kubenswrapper[4962]: I1201 21:37:15.901373 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjnwd" podStartSLOduration=3.423522562 podStartE2EDuration="1m5.901353898s" podCreationTimestamp="2025-12-01 21:36:10 +0000 UTC" firstStartedPulling="2025-12-01 21:36:13.024166746 +0000 UTC m=+157.125605941" lastFinishedPulling="2025-12-01 21:37:15.501998082 +0000 UTC m=+219.603437277" observedRunningTime="2025-12-01 21:37:15.897411352 +0000 UTC m=+219.998850567" watchObservedRunningTime="2025-12-01 21:37:15.901353898 +0000 UTC m=+220.002793113" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.495132 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mpxbb" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.495201 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mpxbb" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.538957 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mpxbb" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.727674 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6knc" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.728044 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-x6knc" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.794663 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6knc" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.916973 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l5625" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.917033 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l5625" Dec 01 21:37:17 crc kubenswrapper[4962]: I1201 21:37:17.983901 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l5625" Dec 01 21:37:18 crc kubenswrapper[4962]: I1201 21:37:18.978116 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l5625" Dec 01 21:37:19 crc kubenswrapper[4962]: I1201 21:37:19.575407 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5625"] Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124404 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-74bhz"] Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124638 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="extract-utilities" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124653 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="extract-utilities" Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124670 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="extract-content" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124679 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="extract-content" Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124692 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="registry-server" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124700 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="registry-server" Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124723 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="registry-server" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124732 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="registry-server" Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124743 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="extract-utilities" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124751 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="extract-utilities" Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124766 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7583fd-7224-4abc-81c1-c592d1182ec0" containerName="pruner" Dec 01 
21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124774 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7583fd-7224-4abc-81c1-c592d1182ec0" containerName="pruner" Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124784 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e294655f-e01a-4944-a8da-3480d76481b8" containerName="pruner" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124791 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e294655f-e01a-4944-a8da-3480d76481b8" containerName="pruner" Dec 01 21:37:20 crc kubenswrapper[4962]: E1201 21:37:20.124803 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="extract-content" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124811 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="extract-content" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124977 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e294655f-e01a-4944-a8da-3480d76481b8" containerName="pruner" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.124993 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="970cf872-ad25-4424-85ca-4bf809e9f5f7" containerName="registry-server" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.125006 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7583fd-7224-4abc-81c1-c592d1182ec0" containerName="pruner" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.125022 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8308dc46-519c-4b6a-8e97-d073484e64ae" containerName="registry-server" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.125439 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.139767 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-74bhz"] Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296203 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-bound-sa-token\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cfa40bb-018f-4c4f-afbf-cfab90e33210-ca-trust-extracted\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296296 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cfa40bb-018f-4c4f-afbf-cfab90e33210-registry-certificates\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296445 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0cfa40bb-018f-4c4f-afbf-cfab90e33210-installation-pull-secrets\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296512 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296535 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cfa40bb-018f-4c4f-afbf-cfab90e33210-trusted-ca\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296605 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-registry-tls\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.296648 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m6m4\" (UniqueName: 
\"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-kube-api-access-9m6m4\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.331551 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.398370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cfa40bb-018f-4c4f-afbf-cfab90e33210-registry-certificates\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.398466 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0cfa40bb-018f-4c4f-afbf-cfab90e33210-installation-pull-secrets\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.398521 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cfa40bb-018f-4c4f-afbf-cfab90e33210-trusted-ca\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.398557 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-registry-tls\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.398594 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m6m4\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-kube-api-access-9m6m4\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.398639 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-bound-sa-token\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.398662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cfa40bb-018f-4c4f-afbf-cfab90e33210-ca-trust-extracted\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.400353 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0cfa40bb-018f-4c4f-afbf-cfab90e33210-ca-trust-extracted\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.400550 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0cfa40bb-018f-4c4f-afbf-cfab90e33210-registry-certificates\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.403767 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cfa40bb-018f-4c4f-afbf-cfab90e33210-trusted-ca\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.418174 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-bound-sa-token\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.420140 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-registry-tls\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.420147 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m6m4\" (UniqueName: \"kubernetes.io/projected/0cfa40bb-018f-4c4f-afbf-cfab90e33210-kube-api-access-9m6m4\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.428971 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0cfa40bb-018f-4c4f-afbf-cfab90e33210-installation-pull-secrets\") pod \"image-registry-66df7c8f76-74bhz\" (UID: \"0cfa40bb-018f-4c4f-afbf-cfab90e33210\") " pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.443447 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.650463 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-74bhz"] Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.724710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.724754 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.774820 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.908441 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" event={"ID":"0cfa40bb-018f-4c4f-afbf-cfab90e33210","Type":"ContainerStarted","Data":"38e6cdfa7140388c657d7d647b42265d45d1dc0fbfef8ffd238593b6177f55d8"} Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.908698 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l5625" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="registry-server" containerID="cri-o://f4d8761ba204257d12b36566746dbab02fa414a21a1d543568459253a0445e9d" gracePeriod=2 Dec 01 21:37:20 crc kubenswrapper[4962]: I1201 21:37:20.948551 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:37:21 crc kubenswrapper[4962]: I1201 21:37:21.077976 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:37:21 crc kubenswrapper[4962]: I1201 21:37:21.078262 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:37:21 crc kubenswrapper[4962]: I1201 21:37:21.915792 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" event={"ID":"0cfa40bb-018f-4c4f-afbf-cfab90e33210","Type":"ContainerStarted","Data":"71e5446fddd7e7c7c1d1c4004847e312b33d5d7aa513ce878d52b5cdaa383a0d"} Dec 01 21:37:22 crc kubenswrapper[4962]: I1201 21:37:22.146412 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjnwd" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="registry-server" probeResult="failure" output=< Dec 01 21:37:22 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 21:37:22 crc kubenswrapper[4962]: > Dec 01 21:37:22 crc kubenswrapper[4962]: I1201 21:37:22.922115 4962 generic.go:334] "Generic (PLEG): container finished" podID="1569b19f-7f89-465d-9140-e2dce5e33425" containerID="f4d8761ba204257d12b36566746dbab02fa414a21a1d543568459253a0445e9d" exitCode=0 Dec 01 21:37:22 crc kubenswrapper[4962]: I1201 21:37:22.922179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5625" event={"ID":"1569b19f-7f89-465d-9140-e2dce5e33425","Type":"ContainerDied","Data":"f4d8761ba204257d12b36566746dbab02fa414a21a1d543568459253a0445e9d"} Dec 01 21:37:22 crc kubenswrapper[4962]: I1201 21:37:22.922653 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:22 crc kubenswrapper[4962]: I1201 21:37:22.942375 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" podStartSLOduration=2.942360515 podStartE2EDuration="2.942360515s" podCreationTimestamp="2025-12-01 21:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:37:22.940823213 +0000 UTC m=+227.042262418" watchObservedRunningTime="2025-12-01 21:37:22.942360515 +0000 UTC m=+227.043799710" Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.826151 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5625" Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.930268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l5625" event={"ID":"1569b19f-7f89-465d-9140-e2dce5e33425","Type":"ContainerDied","Data":"9e75162d6953c5e2a749399c2f5396127e6a6b12f31420394ceb4a11109d69b9"} Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.930292 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l5625" Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.930344 4962 scope.go:117] "RemoveContainer" containerID="f4d8761ba204257d12b36566746dbab02fa414a21a1d543568459253a0445e9d" Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.946988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-catalog-content\") pod \"1569b19f-7f89-465d-9140-e2dce5e33425\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.947301 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkgp7\" (UniqueName: \"kubernetes.io/projected/1569b19f-7f89-465d-9140-e2dce5e33425-kube-api-access-fkgp7\") pod \"1569b19f-7f89-465d-9140-e2dce5e33425\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.947338 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-utilities\") pod \"1569b19f-7f89-465d-9140-e2dce5e33425\" (UID: \"1569b19f-7f89-465d-9140-e2dce5e33425\") " Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.948235 4962 scope.go:117] "RemoveContainer" containerID="1f3f289ac151392bcc47972bea17d814a142d11beefcafcf42c03d0a8bd7f2e8" Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.948664 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-utilities" (OuterVolumeSpecName: "utilities") pod "1569b19f-7f89-465d-9140-e2dce5e33425" (UID: "1569b19f-7f89-465d-9140-e2dce5e33425"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.954667 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1569b19f-7f89-465d-9140-e2dce5e33425-kube-api-access-fkgp7" (OuterVolumeSpecName: "kube-api-access-fkgp7") pod "1569b19f-7f89-465d-9140-e2dce5e33425" (UID: "1569b19f-7f89-465d-9140-e2dce5e33425"). InnerVolumeSpecName "kube-api-access-fkgp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:23 crc kubenswrapper[4962]: I1201 21:37:23.978901 4962 scope.go:117] "RemoveContainer" containerID="8c985e4e459754e1a774e4f55c926ec2a0ddec41d8c5062afeb0045240bfaccf" Dec 01 21:37:24 crc kubenswrapper[4962]: I1201 21:37:24.011187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1569b19f-7f89-465d-9140-e2dce5e33425" (UID: "1569b19f-7f89-465d-9140-e2dce5e33425"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:24 crc kubenswrapper[4962]: I1201 21:37:24.049202 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:24 crc kubenswrapper[4962]: I1201 21:37:24.049476 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkgp7\" (UniqueName: \"kubernetes.io/projected/1569b19f-7f89-465d-9140-e2dce5e33425-kube-api-access-fkgp7\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:24 crc kubenswrapper[4962]: I1201 21:37:24.049549 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1569b19f-7f89-465d-9140-e2dce5e33425-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:24 crc kubenswrapper[4962]: I1201 21:37:24.253916 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l5625"] Dec 01 21:37:24 crc kubenswrapper[4962]: I1201 21:37:24.259330 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l5625"] Dec 01 21:37:26 crc kubenswrapper[4962]: I1201 21:37:26.229997 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" path="/var/lib/kubelet/pods/1569b19f-7f89-465d-9140-e2dce5e33425/volumes" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.370982 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6knc"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.372535 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x6knc" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="registry-server" containerID="cri-o://47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" gracePeriod=30 Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.380626 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mpxbb"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.380913 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mpxbb" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="registry-server" 
containerID="cri-o://470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" gracePeriod=30 Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.382924 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.384461 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.386160 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.386236 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/community-operators-mpxbb" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="registry-server" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.413075 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stv9m"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.413379 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" podUID="f10f3763-03b0-43d0-88fd-ce89274a67d9" containerName="marketplace-operator" containerID="cri-o://0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576" gracePeriod=30 Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.427318 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9h7f"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.427589 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j9h7f" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="registry-server" containerID="cri-o://af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54" gracePeriod=30 Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.431014 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjnwd"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.431181 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjnwd" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="registry-server" containerID="cri-o://29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72" gracePeriod=30 Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.434022 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lg4bg"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.434355 4962 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="extract-utilities" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.434377 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="extract-utilities" Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.434392 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="registry-server" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.434399 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="registry-server" Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.434418 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="extract-content" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.434426 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="extract-content" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.434573 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1569b19f-7f89-465d-9140-e2dce5e33425" containerName="registry-server" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.435122 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.437694 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zr6pm"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.438022 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zr6pm" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="registry-server" containerID="cri-o://86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93" gracePeriod=30 Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.438683 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lg4bg"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.477481 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-x6knc" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="registry-server" probeResult="failure" output="" Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.495339 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c is running failed: container process not found" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.495795 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c is running failed: container process not found" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.496133 4962 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c is running failed: container process not found" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.496209 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-mpxbb" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="registry-server" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.605833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r62r5\" (UniqueName: \"kubernetes.io/projected/def1945b-b735-4267-8798-cdb6e28ac006-kube-api-access-r62r5\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.605902 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/def1945b-b735-4267-8798-cdb6e28ac006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.605965 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/def1945b-b735-4267-8798-cdb6e28ac006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.706957 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r62r5\" (UniqueName: \"kubernetes.io/projected/def1945b-b735-4267-8798-cdb6e28ac006-kube-api-access-r62r5\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.707036 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/def1945b-b735-4267-8798-cdb6e28ac006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.707080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/def1945b-b735-4267-8798-cdb6e28ac006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.708357 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/def1945b-b735-4267-8798-cdb6e28ac006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.719620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/def1945b-b735-4267-8798-cdb6e28ac006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.732027 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0 is running failed: container process not found" containerID="47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.732604 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0 is running failed: container process not found" containerID="47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.732659 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r62r5\" (UniqueName: \"kubernetes.io/projected/def1945b-b735-4267-8798-cdb6e28ac006-kube-api-access-r62r5\") pod \"marketplace-operator-79b997595-lg4bg\" (UID: \"def1945b-b735-4267-8798-cdb6e28ac006\") " pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.733135 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0 is running failed: container process not found" containerID="47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:27 crc kubenswrapper[4962]: E1201 21:37:27.733188 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-x6knc" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="registry-server" Dec 01 21:37:27 crc kubenswrapper[4962]: I1201 21:37:27.782191 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:28 crc kubenswrapper[4962]: I1201 21:37:28.013155 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lg4bg"] Dec 01 21:37:28 crc kubenswrapper[4962]: I1201 21:37:28.769492 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-csp6p"] Dec 01 21:37:28 crc kubenswrapper[4962]: I1201 21:37:28.955340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" event={"ID":"def1945b-b735-4267-8798-cdb6e28ac006","Type":"ContainerStarted","Data":"1729d2e8f7c9fda018884814877cc03105cee57cfba0783fd00ce0cec0a8b743"} Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.587598 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.654453 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpxbb" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.667986 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6knc" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.674201 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:37:29 crc kubenswrapper[4962]: E1201 21:37:29.702012 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54 is running failed: container process not found" containerID="af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:29 crc kubenswrapper[4962]: E1201 21:37:29.704588 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54 is running failed: container process not found" containerID="af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:29 crc kubenswrapper[4962]: E1201 21:37:29.705118 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54 is running failed: container process not found" containerID="af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 21:37:29 crc kubenswrapper[4962]: E1201 21:37:29.705187 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-j9h7f" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="registry-server" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.744101 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-catalog-content\") pod \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.744172 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpbcs\" (UniqueName: \"kubernetes.io/projected/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-kube-api-access-tpbcs\") pod \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.744248 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-utilities\") pod \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\" (UID: \"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.745455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-utilities" (OuterVolumeSpecName: "utilities") pod "d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" (UID: "d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.760341 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-kube-api-access-tpbcs" (OuterVolumeSpecName: "kube-api-access-tpbcs") pod "d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" (UID: "d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a"). InnerVolumeSpecName "kube-api-access-tpbcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.845859 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca\") pod \"f10f3763-03b0-43d0-88fd-ce89274a67d9\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.846015 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-utilities\") pod \"a4620db4-9171-44b5-b944-dcf2e871ef41\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.846054 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-catalog-content\") pod \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.846079 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqjb4\" (UniqueName: \"kubernetes.io/projected/f10f3763-03b0-43d0-88fd-ce89274a67d9-kube-api-access-zqjb4\") pod \"f10f3763-03b0-43d0-88fd-ce89274a67d9\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.846581 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod 
"f10f3763-03b0-43d0-88fd-ce89274a67d9" (UID: "f10f3763-03b0-43d0-88fd-ce89274a67d9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.846863 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-utilities" (OuterVolumeSpecName: "utilities") pod "a4620db4-9171-44b5-b944-dcf2e871ef41" (UID: "a4620db4-9171-44b5-b944-dcf2e871ef41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.852254 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10f3763-03b0-43d0-88fd-ce89274a67d9-kube-api-access-zqjb4" (OuterVolumeSpecName: "kube-api-access-zqjb4") pod "f10f3763-03b0-43d0-88fd-ce89274a67d9" (UID: "f10f3763-03b0-43d0-88fd-ce89274a67d9"). InnerVolumeSpecName "kube-api-access-zqjb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.853135 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics\") pod \"f10f3763-03b0-43d0-88fd-ce89274a67d9\" (UID: \"f10f3763-03b0-43d0-88fd-ce89274a67d9\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.853191 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-catalog-content\") pod \"a4620db4-9171-44b5-b944-dcf2e871ef41\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.853232 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-utilities\") pod \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.853252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxfzb\" (UniqueName: \"kubernetes.io/projected/b9788313-1ab9-4dce-8fd0-363c8086d8d3-kube-api-access-cxfzb\") pod \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\" (UID: \"b9788313-1ab9-4dce-8fd0-363c8086d8d3\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.853737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz8pv\" (UniqueName: \"kubernetes.io/projected/a4620db4-9171-44b5-b944-dcf2e871ef41-kube-api-access-qz8pv\") pod \"a4620db4-9171-44b5-b944-dcf2e871ef41\" (UID: \"a4620db4-9171-44b5-b944-dcf2e871ef41\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.854912 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-utilities" (OuterVolumeSpecName: "utilities") pod "b9788313-1ab9-4dce-8fd0-363c8086d8d3" (UID: "b9788313-1ab9-4dce-8fd0-363c8086d8d3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.855680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f10f3763-03b0-43d0-88fd-ce89274a67d9" (UID: "f10f3763-03b0-43d0-88fd-ce89274a67d9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.855977 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.856004 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqjb4\" (UniqueName: \"kubernetes.io/projected/f10f3763-03b0-43d0-88fd-ce89274a67d9-kube-api-access-zqjb4\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.856022 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.856032 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.856040 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.856050 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f10f3763-03b0-43d0-88fd-ce89274a67d9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.856059 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpbcs\" (UniqueName: \"kubernetes.io/projected/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-kube-api-access-tpbcs\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.858454 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4620db4-9171-44b5-b944-dcf2e871ef41-kube-api-access-qz8pv" (OuterVolumeSpecName: "kube-api-access-qz8pv") pod "a4620db4-9171-44b5-b944-dcf2e871ef41" (UID: "a4620db4-9171-44b5-b944-dcf2e871ef41"). InnerVolumeSpecName "kube-api-access-qz8pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.859735 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9788313-1ab9-4dce-8fd0-363c8086d8d3-kube-api-access-cxfzb" (OuterVolumeSpecName: "kube-api-access-cxfzb") pod "b9788313-1ab9-4dce-8fd0-363c8086d8d3" (UID: "b9788313-1ab9-4dce-8fd0-363c8086d8d3"). InnerVolumeSpecName "kube-api-access-cxfzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.871970 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.893507 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" (UID: "d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.942821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9788313-1ab9-4dce-8fd0-363c8086d8d3" (UID: "b9788313-1ab9-4dce-8fd0-363c8086d8d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.949738 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4620db4-9171-44b5-b944-dcf2e871ef41" (UID: "a4620db4-9171-44b5-b944-dcf2e871ef41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.967714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-utilities\") pod \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968478 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-utilities" (OuterVolumeSpecName: "utilities") pod "2c522e07-a29f-4b5f-be82-e5eac46c1f6a" (UID: "2c522e07-a29f-4b5f-be82-e5eac46c1f6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9szx8\" (UniqueName: \"kubernetes.io/projected/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-kube-api-access-9szx8\") pod \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968582 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-catalog-content\") pod \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\" (UID: \"2c522e07-a29f-4b5f-be82-e5eac46c1f6a\") " Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968874 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4620db4-9171-44b5-b944-dcf2e871ef41-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968886 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxfzb\" (UniqueName: \"kubernetes.io/projected/b9788313-1ab9-4dce-8fd0-363c8086d8d3-kube-api-access-cxfzb\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968897 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968905 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz8pv\" (UniqueName: \"kubernetes.io/projected/a4620db4-9171-44b5-b944-dcf2e871ef41-kube-api-access-qz8pv\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968912 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.968921 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9788313-1ab9-4dce-8fd0-363c8086d8d3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.980296 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-kube-api-access-9szx8" (OuterVolumeSpecName: "kube-api-access-9szx8") pod "2c522e07-a29f-4b5f-be82-e5eac46c1f6a" (UID: "2c522e07-a29f-4b5f-be82-e5eac46c1f6a"). InnerVolumeSpecName "kube-api-access-9szx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.986442 4962 generic.go:334] "Generic (PLEG): container finished" podID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerID="af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54" exitCode=0 Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.986517 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9h7f" event={"ID":"542eb764-e2cd-4043-9721-fe8f5d6d5d13","Type":"ContainerDied","Data":"af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54"} Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.988089 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerID="86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93" exitCode=0 Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.988124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr6pm" event={"ID":"2c522e07-a29f-4b5f-be82-e5eac46c1f6a","Type":"ContainerDied","Data":"86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93"} Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.988139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr6pm" event={"ID":"2c522e07-a29f-4b5f-be82-e5eac46c1f6a","Type":"ContainerDied","Data":"eef6c6cb8df65a7865ab6eceeab1f83d60670fce9eca546ec76ab07e08f7e289"} Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.988156 4962 scope.go:117] "RemoveContainer" containerID="86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.988261 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zr6pm" Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.997784 4962 generic.go:334] "Generic (PLEG): container finished" podID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerID="29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72" exitCode=0 Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.999193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnwd" event={"ID":"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a","Type":"ContainerDied","Data":"29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72"} Dec 01 21:37:29 crc kubenswrapper[4962]: I1201 21:37:29.999633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnwd" event={"ID":"d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a","Type":"ContainerDied","Data":"85adddb405d44add697ffdd530f174033a3caa85b7920c3fc46541941c432513"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.000092 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnwd" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.005844 4962 generic.go:334] "Generic (PLEG): container finished" podID="f10f3763-03b0-43d0-88fd-ce89274a67d9" containerID="0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576" exitCode=0 Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.005973 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.006253 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" event={"ID":"f10f3763-03b0-43d0-88fd-ce89274a67d9","Type":"ContainerDied","Data":"0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.006295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-stv9m" event={"ID":"f10f3763-03b0-43d0-88fd-ce89274a67d9","Type":"ContainerDied","Data":"691540e4553b2a2f34c4a2d6839c721ee5869b0bb40c03edf2e133c68c5c2c45"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.016530 4962 generic.go:334] "Generic (PLEG): container finished" podID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" exitCode=0 Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.016585 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpxbb" event={"ID":"b9788313-1ab9-4dce-8fd0-363c8086d8d3","Type":"ContainerDied","Data":"470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.016609 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpxbb" event={"ID":"b9788313-1ab9-4dce-8fd0-363c8086d8d3","Type":"ContainerDied","Data":"f93a2256f6a9d6a4b618d9b0c0ecbe732b8ff4cfc417a3ce09e523aad0ea87bf"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.016674 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpxbb" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.027976 4962 scope.go:117] "RemoveContainer" containerID="ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.028606 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerID="47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" exitCode=0 Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.028649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6knc" event={"ID":"a4620db4-9171-44b5-b944-dcf2e871ef41","Type":"ContainerDied","Data":"47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.028696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6knc" event={"ID":"a4620db4-9171-44b5-b944-dcf2e871ef41","Type":"ContainerDied","Data":"da9ad88de7b256ba8a05c12739f712eaa71d6b39c8d12f5acd23aabcf8074fd1"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.028774 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6knc" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.033827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" event={"ID":"def1945b-b735-4267-8798-cdb6e28ac006","Type":"ContainerStarted","Data":"699d31a3a9f7c3aa2bd8b801e0c852d0f2d1537358360d6f10ed6128b93054f2"} Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.034411 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.039854 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.045713 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stv9m"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.046798 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.052110 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-stv9m"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.056222 4962 scope.go:117] "RemoveContainer" containerID="9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.071149 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbdj6\" (UniqueName: \"kubernetes.io/projected/542eb764-e2cd-4043-9721-fe8f5d6d5d13-kube-api-access-fbdj6\") pod \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.071275 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-catalog-content\") pod \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.071266 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lg4bg" podStartSLOduration=3.071232619 podStartE2EDuration="3.071232619s" podCreationTimestamp="2025-12-01 21:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:37:30.056309055 +0000 UTC m=+234.157748250" watchObservedRunningTime="2025-12-01 21:37:30.071232619 +0000 UTC m=+234.172671804" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.071301 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-utilities\") pod \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\" (UID: \"542eb764-e2cd-4043-9721-fe8f5d6d5d13\") " Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.073547 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-utilities" (OuterVolumeSpecName: "utilities") pod "542eb764-e2cd-4043-9721-fe8f5d6d5d13" (UID: "542eb764-e2cd-4043-9721-fe8f5d6d5d13"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.074291 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mpxbb"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.074622 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9szx8\" (UniqueName: \"kubernetes.io/projected/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-kube-api-access-9szx8\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.075510 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.077579 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mpxbb"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.086192 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c522e07-a29f-4b5f-be82-e5eac46c1f6a" (UID: "2c522e07-a29f-4b5f-be82-e5eac46c1f6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.105044 4962 scope.go:117] "RemoveContainer" containerID="86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.106268 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93\": container with ID starting with 86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93 not found: ID does not exist" containerID="86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.106313 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93"} err="failed to get container status \"86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93\": rpc error: code = NotFound desc = could not find container \"86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93\": container with ID starting with 86341b7c09507b951e35616189b882e33c61c4e945f43e4bae8e5cda7b832e93 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.106340 4962 scope.go:117] "RemoveContainer" containerID="ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.107568 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542eb764-e2cd-4043-9721-fe8f5d6d5d13-kube-api-access-fbdj6" (OuterVolumeSpecName: "kube-api-access-fbdj6") pod "542eb764-e2cd-4043-9721-fe8f5d6d5d13" (UID: "542eb764-e2cd-4043-9721-fe8f5d6d5d13"). InnerVolumeSpecName "kube-api-access-fbdj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.109092 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137\": container with ID starting with ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137 not found: ID does not exist" containerID="ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.109120 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137"} err="failed to get container status \"ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137\": rpc error: code = NotFound desc = could not find container \"ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137\": container with ID starting with ecf4d77c279a5ba520a36bdfac9721a6b77e9308f0d7c75228ada1db85c44137 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.109135 4962 scope.go:117] "RemoveContainer" containerID="9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.109470 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0\": container with ID starting with 9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0 not found: ID does not exist" containerID="9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.109509 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0"} err="failed to get container status \"9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0\": rpc error: code = NotFound desc = could not find container \"9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0\": container with ID starting with 9f15e6fbf10941932e1da243b9892bc118049550a289399e4050947fbea68cf0 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.109538 4962 scope.go:117] "RemoveContainer" containerID="29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.113897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "542eb764-e2cd-4043-9721-fe8f5d6d5d13" (UID: "542eb764-e2cd-4043-9721-fe8f5d6d5d13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.119222 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjnwd"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.132464 4962 scope.go:117] "RemoveContainer" containerID="c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.134041 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjnwd"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.148988 4962 scope.go:117] "RemoveContainer" containerID="b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.160230 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6knc"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.160309 4962 scope.go:117] "RemoveContainer" containerID="29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.161249 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72\": container with ID starting with 29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72 not found: ID does not exist" containerID="29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.161297 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72"} err="failed to get container status \"29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72\": rpc error: code = NotFound desc = could not find container \"29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72\": container with ID starting with 29b39f2a636d2e309d9c079b474dbcde6a89e9a001c9db390ab375821c3efd72 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.161326 4962 scope.go:117] "RemoveContainer" containerID="c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.161649 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64\": container with ID starting with c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64 not found: ID does not exist" containerID="c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.161680 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64"} err="failed to get container status \"c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64\": rpc error: code = NotFound desc = could not find container \"c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64\": container with ID starting with c0fadbadec801895092d269e0f453ec970db64a0cb192da1c6c761322fa7cd64 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.161703 4962 scope.go:117] "RemoveContainer" 
containerID="b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.161894 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27\": container with ID starting with b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27 not found: ID does not exist" containerID="b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.161947 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27"} err="failed to get container status \"b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27\": rpc error: code = NotFound desc = could not find container \"b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27\": container with ID starting with b7a088a3c22704cb92a5a99344fecfb47550ee7b24e56a0cbeee59b6e7753f27 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.161963 4962 scope.go:117] "RemoveContainer" containerID="0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.166878 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6knc"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.172805 4962 scope.go:117] "RemoveContainer" containerID="0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.173155 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576\": container with ID starting with 0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576 not found: ID does not exist" containerID="0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.173211 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576"} err="failed to get container status \"0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576\": rpc error: code = NotFound desc = could not find container \"0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576\": container with ID starting with 0c4ad07bcc64930ec2ea0fd3bab698ecb5d312c499f12b8042bd15c3d6d79576 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.173238 4962 scope.go:117] "RemoveContainer" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.176429 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c522e07-a29f-4b5f-be82-e5eac46c1f6a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.176451 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbdj6\" (UniqueName: \"kubernetes.io/projected/542eb764-e2cd-4043-9721-fe8f5d6d5d13-kube-api-access-fbdj6\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.176462 4962 reconciler_common.go:293] 
"Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/542eb764-e2cd-4043-9721-fe8f5d6d5d13-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.190094 4962 scope.go:117] "RemoveContainer" containerID="ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.203577 4962 scope.go:117] "RemoveContainer" containerID="36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.217704 4962 scope.go:117] "RemoveContainer" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.218149 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c\": container with ID starting with 470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c not found: ID does not exist" containerID="470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.218178 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c"} err="failed to get container status \"470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c\": rpc error: code = NotFound desc = could not find container \"470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c\": container with ID starting with 470d7563c342109eadd956602357db0cd2168c04b1141bdce4766bd37dc7e21c not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.218198 4962 scope.go:117] "RemoveContainer" containerID="ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.218559 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce\": container with ID starting with ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce not found: ID does not exist" containerID="ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.218585 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce"} err="failed to get container status \"ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce\": rpc error: code = NotFound desc = could not find container \"ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce\": container with ID starting with ccbfd8375aea896fb731bf6832958397603861a2a486699a9c2aead6853b01ce not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.218599 4962 scope.go:117] "RemoveContainer" containerID="36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.219011 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891\": container with ID starting with 36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891 not found: 
ID does not exist" containerID="36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.219031 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891"} err="failed to get container status \"36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891\": rpc error: code = NotFound desc = could not find container \"36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891\": container with ID starting with 36229efaf41b9432aa6c30762b6908f76c4d64613059d01842a762f5d3b08891 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.219046 4962 scope.go:117] "RemoveContainer" containerID="47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.224196 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" path="/var/lib/kubelet/pods/a4620db4-9171-44b5-b944-dcf2e871ef41/volumes" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.224735 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" path="/var/lib/kubelet/pods/b9788313-1ab9-4dce-8fd0-363c8086d8d3/volumes" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.225827 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" path="/var/lib/kubelet/pods/d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a/volumes" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.226632 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10f3763-03b0-43d0-88fd-ce89274a67d9" path="/var/lib/kubelet/pods/f10f3763-03b0-43d0-88fd-ce89274a67d9/volumes" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.231924 4962 scope.go:117] "RemoveContainer" containerID="92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.254500 4962 scope.go:117] "RemoveContainer" containerID="44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.284314 4962 scope.go:117] "RemoveContainer" containerID="47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.285111 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0\": container with ID starting with 47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0 not found: ID does not exist" containerID="47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.285188 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0"} err="failed to get container status \"47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0\": rpc error: code = NotFound desc = could not find container \"47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0\": container with ID starting with 47fd1722336ab6778e4fd62a0eefcd3d902b43fa345f94699c8b6b368e2ce0e0 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.285248 4962 scope.go:117] 
"RemoveContainer" containerID="92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.285649 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04\": container with ID starting with 92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04 not found: ID does not exist" containerID="92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.285692 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04"} err="failed to get container status \"92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04\": rpc error: code = NotFound desc = could not find container \"92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04\": container with ID starting with 92b16bf04b13d30418c74797f6ec8ea4cbc2cc8316df87dcab0950c2a6524d04 not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.285718 4962 scope.go:117] "RemoveContainer" containerID="44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b" Dec 01 21:37:30 crc kubenswrapper[4962]: E1201 21:37:30.286022 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b\": container with ID starting with 44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b not found: ID does not exist" containerID="44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.286064 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b"} err="failed to get container status \"44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b\": rpc error: code = NotFound desc = could not find container \"44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b\": container with ID starting with 44e6e4a1e9f7c346a88a3cff569ec23c9e648d76db99b2bb69cee1eb6e00a99b not found: ID does not exist" Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.312987 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zr6pm"] Dec 01 21:37:30 crc kubenswrapper[4962]: I1201 21:37:30.318358 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zr6pm"] Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.041755 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9h7f" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.041797 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9h7f" event={"ID":"542eb764-e2cd-4043-9721-fe8f5d6d5d13","Type":"ContainerDied","Data":"d50a73638c479e76d3beb5b452d38e6559599a99fb1fd24a4ad7488d7163dc23"} Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.041862 4962 scope.go:117] "RemoveContainer" containerID="af8aa798b4813eae1308a47731b5026329864daffa8b1689b9cd1480898f7f54" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.066647 4962 scope.go:117] "RemoveContainer" containerID="de86d7faffa7c5acfddf3aeda3fbcef7c401908f6f019cb2e934fb45a32e82bc" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.081233 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9h7f"] Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.086652 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9h7f"] Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.103404 4962 scope.go:117] "RemoveContainer" containerID="3ef67691664d49c549408d0a1a537421dea2fe13f1bd4f059fe2524319683b75" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.583926 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t6qvm"] Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584181 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584198 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584213 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584220 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584232 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584239 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584251 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584258 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584266 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584272 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584282 4962 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584287 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584301 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584310 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584320 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584327 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584340 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584348 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584361 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584368 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584377 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584384 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584392 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584400 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584409 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584417 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584429 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10f3763-03b0-43d0-88fd-ce89274a67d9" containerName="marketplace-operator" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584436 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10f3763-03b0-43d0-88fd-ce89274a67d9" containerName="marketplace-operator" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584446 4962 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584454 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="extract-utilities" Dec 01 21:37:31 crc kubenswrapper[4962]: E1201 21:37:31.584466 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584474 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="extract-content" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584568 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ee99ce-d281-4196-b1aa-1d69bc8c6f2a" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584582 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10f3763-03b0-43d0-88fd-ce89274a67d9" containerName="marketplace-operator" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584590 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584599 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584607 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9788313-1ab9-4dce-8fd0-363c8086d8d3" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.584615 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4620db4-9171-44b5-b944-dcf2e871ef41" containerName="registry-server" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.585387 4962 util.go:30] "No sandbox for pod can be found. 
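
The cpu_manager/memory_manager block above is housekeeping, not a failure: the deleted catalog pods still have entries in the CPU and memory manager checkpoints, and admitting the next pod triggers RemoveStaleState, which the kubelet happens to emit at error severity (an ErrorS call with no error value, hence the E prefix and missing err field) even though the cleanup is routine. Grouping the paired "Deleted CPUSet assignment" lines by pod shows each pod's extract-utilities, extract-content, and registry-server containers dropped exactly once. A sketch over the same one-entry-per-line stdin pipe, with the key=value quoting assumed as shown:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches: "Deleted CPUSet assignment" podUID="..." containerName="..."
var staleRe = regexp.MustCompile(`Deleted CPUSet assignment" podUID="([0-9a-f-]+)" containerName="([^"]+)"`)

func main() {
	byPod := map[string][]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 4*1024*1024)
	for sc.Scan() {
		if m := staleRe.FindStringSubmatch(sc.Text()); m != nil {
			byPod[m[1]] = append(byPod[m[1]], m[2])
		}
	}
	for pod, containers := range byPod {
		fmt.Printf("%s…: %v\n", pod[:8], containers)
	}
}
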
Need to start a new one" pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.588440 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.597917 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6qvm"] Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.699261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5kqj\" (UniqueName: \"kubernetes.io/projected/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-kube-api-access-l5kqj\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.699512 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-utilities\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.699681 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-catalog-content\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.778237 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cckxw"] Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.780350 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.783092 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.800681 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cckxw"] Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.800686 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jl6\" (UniqueName: \"kubernetes.io/projected/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-kube-api-access-n5jl6\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.800898 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-catalog-content\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.800986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-catalog-content\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.801102 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5kqj\" (UniqueName: \"kubernetes.io/projected/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-kube-api-access-l5kqj\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.801140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-utilities\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.801202 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-utilities\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.801835 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-catalog-content\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.802073 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-utilities\") pod \"community-operators-t6qvm\" (UID: 
\"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.826582 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5kqj\" (UniqueName: \"kubernetes.io/projected/21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a-kube-api-access-l5kqj\") pod \"community-operators-t6qvm\" (UID: \"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a\") " pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.902893 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jl6\" (UniqueName: \"kubernetes.io/projected/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-kube-api-access-n5jl6\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.902990 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-catalog-content\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.903036 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-utilities\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.903512 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-utilities\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.903870 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-catalog-content\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.906672 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:31 crc kubenswrapper[4962]: I1201 21:37:31.931437 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jl6\" (UniqueName: \"kubernetes.io/projected/d6ce364a-a2f9-48fa-9c65-5f8e65da569f-kube-api-access-n5jl6\") pod \"redhat-operators-cckxw\" (UID: \"d6ce364a-a2f9-48fa-9c65-5f8e65da569f\") " pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:32 crc kubenswrapper[4962]: I1201 21:37:32.103053 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:32 crc kubenswrapper[4962]: I1201 21:37:32.121140 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6qvm"] Dec 01 21:37:32 crc kubenswrapper[4962]: W1201 21:37:32.142237 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21cefc69_51ca_4baa_a4f4_aa7de0d8aa7a.slice/crio-41eb82ae0c26cdf50b80818b6c827581e020e53e8941392f4faa4a1d54fe89c4 WatchSource:0}: Error finding container 41eb82ae0c26cdf50b80818b6c827581e020e53e8941392f4faa4a1d54fe89c4: Status 404 returned error can't find the container with id 41eb82ae0c26cdf50b80818b6c827581e020e53e8941392f4faa4a1d54fe89c4 Dec 01 21:37:32 crc kubenswrapper[4962]: I1201 21:37:32.256314 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c522e07-a29f-4b5f-be82-e5eac46c1f6a" path="/var/lib/kubelet/pods/2c522e07-a29f-4b5f-be82-e5eac46c1f6a/volumes" Dec 01 21:37:32 crc kubenswrapper[4962]: I1201 21:37:32.257421 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542eb764-e2cd-4043-9721-fe8f5d6d5d13" path="/var/lib/kubelet/pods/542eb764-e2cd-4043-9721-fe8f5d6d5d13/volumes" Dec 01 21:37:32 crc kubenswrapper[4962]: I1201 21:37:32.362594 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cckxw"] Dec 01 21:37:32 crc kubenswrapper[4962]: W1201 21:37:32.370881 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ce364a_a2f9_48fa_9c65_5f8e65da569f.slice/crio-b6761a19b40f0b36c99f34d8fc7eedb45591b0202cff72f04fcaf828713cc59a WatchSource:0}: Error finding container b6761a19b40f0b36c99f34d8fc7eedb45591b0202cff72f04fcaf828713cc59a: Status 404 returned error can't find the container with id b6761a19b40f0b36c99f34d8fc7eedb45591b0202cff72f04fcaf828713cc59a Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.083334 4962 generic.go:334] "Generic (PLEG): container finished" podID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" containerID="024fc11a6167a637433a86ce960ae3d8e14a0f427998b84174d10cc3b3658faf" exitCode=0 Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.083398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cckxw" event={"ID":"d6ce364a-a2f9-48fa-9c65-5f8e65da569f","Type":"ContainerDied","Data":"024fc11a6167a637433a86ce960ae3d8e14a0f427998b84174d10cc3b3658faf"} Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.083716 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cckxw" event={"ID":"d6ce364a-a2f9-48fa-9c65-5f8e65da569f","Type":"ContainerStarted","Data":"b6761a19b40f0b36c99f34d8fc7eedb45591b0202cff72f04fcaf828713cc59a"} Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.087280 4962 generic.go:334] "Generic (PLEG): container finished" podID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" containerID="d79a1842784f9ad5ee958c415555769787eaa40972e7b373b65fe9c91d760a29" exitCode=0 Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.088160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6qvm" event={"ID":"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a","Type":"ContainerDied","Data":"d79a1842784f9ad5ee958c415555769787eaa40972e7b373b65fe9c91d760a29"} Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.088222 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6qvm" event={"ID":"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a","Type":"ContainerStarted","Data":"41eb82ae0c26cdf50b80818b6c827581e020e53e8941392f4faa4a1d54fe89c4"} Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.984478 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4sv58"] Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.986041 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.988339 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 21:37:33 crc kubenswrapper[4962]: I1201 21:37:33.994707 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sv58"] Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.070065 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kf9z\" (UniqueName: \"kubernetes.io/projected/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-kube-api-access-2kf9z\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.070295 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-catalog-content\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.070404 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-utilities\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.171787 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-catalog-content\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.172186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-utilities\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.172479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kf9z\" (UniqueName: \"kubernetes.io/projected/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-kube-api-access-2kf9z\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.173038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-utilities\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.173147 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-catalog-content\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.194881 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6bphp"] Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.201560 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.203174 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bphp"] Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.204053 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.217562 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kf9z\" (UniqueName: \"kubernetes.io/projected/3ea88a22-b18e-4b46-812c-35cb8dcdeb30-kube-api-access-2kf9z\") pod \"certified-operators-4sv58\" (UID: \"3ea88a22-b18e-4b46-812c-35cb8dcdeb30\") " pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.274015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-utilities\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.274139 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-catalog-content\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.274180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8frq9\" (UniqueName: \"kubernetes.io/projected/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-kube-api-access-8frq9\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.338760 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.375549 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8frq9\" (UniqueName: \"kubernetes.io/projected/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-kube-api-access-8frq9\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.375733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-utilities\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.375875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-catalog-content\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.376389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-catalog-content\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.377180 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-utilities\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.397901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8frq9\" (UniqueName: \"kubernetes.io/projected/6ee35195-33b7-4bc8-80fb-7eb9f0ca221f-kube-api-access-8frq9\") pod \"redhat-marketplace-6bphp\" (UID: \"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f\") " pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.517239 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.709036 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bphp"] Dec 01 21:37:34 crc kubenswrapper[4962]: W1201 21:37:34.721166 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee35195_33b7_4bc8_80fb_7eb9f0ca221f.slice/crio-dab24b4866293f1547f4b94ad756614696d9036eb2c0413c9e2f28ddb462bfb9 WatchSource:0}: Error finding container dab24b4866293f1547f4b94ad756614696d9036eb2c0413c9e2f28ddb462bfb9: Status 404 returned error can't find the container with id dab24b4866293f1547f4b94ad756614696d9036eb2c0413c9e2f28ddb462bfb9 Dec 01 21:37:34 crc kubenswrapper[4962]: I1201 21:37:34.745685 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sv58"] Dec 01 21:37:35 crc kubenswrapper[4962]: I1201 21:37:35.100333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cckxw" event={"ID":"d6ce364a-a2f9-48fa-9c65-5f8e65da569f","Type":"ContainerStarted","Data":"b27197336395df6a815efa02c13908ca823d5671666631d8afbe776d987be562"} Dec 01 21:37:35 crc kubenswrapper[4962]: I1201 21:37:35.102718 4962 generic.go:334] "Generic (PLEG): container finished" podID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" containerID="969b7948dcb30742a75ebaa241cad824697e05860b329024610986d18a8bb3e5" exitCode=0 Dec 01 21:37:35 crc kubenswrapper[4962]: I1201 21:37:35.102780 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bphp" event={"ID":"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f","Type":"ContainerDied","Data":"969b7948dcb30742a75ebaa241cad824697e05860b329024610986d18a8bb3e5"} Dec 01 21:37:35 crc kubenswrapper[4962]: I1201 21:37:35.102801 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bphp" event={"ID":"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f","Type":"ContainerStarted","Data":"dab24b4866293f1547f4b94ad756614696d9036eb2c0413c9e2f28ddb462bfb9"} Dec 01 21:37:35 crc kubenswrapper[4962]: I1201 21:37:35.105525 4962 generic.go:334] "Generic (PLEG): container finished" podID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" containerID="7c56b2a5ea1b6735ff6a6e0263b0e37bd43fdaa23f08b8af0f22f9fcd37e103a" exitCode=0 Dec 01 21:37:35 crc kubenswrapper[4962]: I1201 21:37:35.105572 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sv58" event={"ID":"3ea88a22-b18e-4b46-812c-35cb8dcdeb30","Type":"ContainerDied","Data":"7c56b2a5ea1b6735ff6a6e0263b0e37bd43fdaa23f08b8af0f22f9fcd37e103a"} Dec 01 21:37:35 crc kubenswrapper[4962]: I1201 21:37:35.105600 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sv58" event={"ID":"3ea88a22-b18e-4b46-812c-35cb8dcdeb30","Type":"ContainerStarted","Data":"e06fa76c0250346651117248fc0e89923116deab477c9844beeeb3b5783356d5"} Dec 01 21:37:36 crc kubenswrapper[4962]: I1201 21:37:36.113082 4962 generic.go:334] "Generic (PLEG): container finished" podID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" containerID="93b8e5ac9e5ef8803464849c341ce87abb199b73371ac2f3d1290161211ba71b" exitCode=0 Dec 01 21:37:36 crc kubenswrapper[4962]: I1201 21:37:36.113179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6qvm" 
event={"ID":"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a","Type":"ContainerDied","Data":"93b8e5ac9e5ef8803464849c341ce87abb199b73371ac2f3d1290161211ba71b"} Dec 01 21:37:36 crc kubenswrapper[4962]: I1201 21:37:36.115778 4962 generic.go:334] "Generic (PLEG): container finished" podID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" containerID="b27197336395df6a815efa02c13908ca823d5671666631d8afbe776d987be562" exitCode=0 Dec 01 21:37:36 crc kubenswrapper[4962]: I1201 21:37:36.115826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cckxw" event={"ID":"d6ce364a-a2f9-48fa-9c65-5f8e65da569f","Type":"ContainerDied","Data":"b27197336395df6a815efa02c13908ca823d5671666631d8afbe776d987be562"} Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.122584 4962 generic.go:334] "Generic (PLEG): container finished" podID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" containerID="fb81f9dc92d4b0597e47287d966f334703513b6a8848ffa3937f9ebeec5f56a3" exitCode=0 Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.122679 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bphp" event={"ID":"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f","Type":"ContainerDied","Data":"fb81f9dc92d4b0597e47287d966f334703513b6a8848ffa3937f9ebeec5f56a3"} Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.125584 4962 generic.go:334] "Generic (PLEG): container finished" podID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" containerID="651b08209cf1e93576a29d30ddc48ea39e1df42db2be41d893f65061acf92c0b" exitCode=0 Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.125630 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sv58" event={"ID":"3ea88a22-b18e-4b46-812c-35cb8dcdeb30","Type":"ContainerDied","Data":"651b08209cf1e93576a29d30ddc48ea39e1df42db2be41d893f65061acf92c0b"} Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.294758 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298191 4962 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298220 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298310 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298737 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6" gracePeriod=15 Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298753 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee" gracePeriod=15 Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298828 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff" gracePeriod=15 Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298905 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874" gracePeriod=15 Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.298925 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8" gracePeriod=15 Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.299241 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.299254 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.299265 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.300769 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.300788 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.300795 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.300804 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.300811 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.300822 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.300829 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.300844 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.300852 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301077 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301091 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301102 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301113 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301127 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.301248 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301260 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301305 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.301416 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.354639 4962 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418214 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418269 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418288 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.418425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519534 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519588 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519661 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519706 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519708 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519673 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519842 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519914 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.519997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.520066 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:37 crc kubenswrapper[4962]: E1201 21:37:37.546898 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-cckxw.187d3523f4cdb9c5 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-cckxw,UID:d6ce364a-a2f9-48fa-9c65-5f8e65da569f,APIVersion:v1,ResourceVersion:29429,FieldPath:spec.containers{registry-server},},Reason:Created,Message:Created container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 21:37:37.546353093 +0000 UTC m=+241.647792298,LastTimestamp:2025-12-01 21:37:37.546353093 +0000 UTC m=+241.647792298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 21:37:37 crc kubenswrapper[4962]: I1201 21:37:37.656086 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.132485 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6qvm" event={"ID":"21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a","Type":"ContainerStarted","Data":"83bffa257a413145b293b17070a718bd62937dd740497613744090bfe9e36047"} Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.133158 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.135182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cckxw" event={"ID":"d6ce364a-a2f9-48fa-9c65-5f8e65da569f","Type":"ContainerStarted","Data":"cc24ea89dd62b1c96a9f9517b1a56c088b28361b9bddfe9be6a07bc52a48ad98"} Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.136278 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.136550 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.137148 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c84559d0dc527ca39a746d3e1b24b6706ea1243d121135a8f297e6c2fa0a69ed"} Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.137179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5346b2f445e8b0a9d29b85885f00aae1b55751dbd553571ade7251d2679d6300"} Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.137661 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: E1201 21:37:38.137718 4962 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.137818 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.141332 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.142406 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.143051 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee" exitCode=0 Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.143074 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874" exitCode=0 Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.143085 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff" exitCode=0 Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.143094 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8" exitCode=2 Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.143149 4962 scope.go:117] "RemoveContainer" containerID="4fa02037f2a0411aa4c3b7eff66c72a1885d70e56cf72cdf786d04b5840ecbe6" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.145239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sv58" event={"ID":"3ea88a22-b18e-4b46-812c-35cb8dcdeb30","Type":"ContainerStarted","Data":"0ee34f2c70499dd7fb539a1014899e81e367ff3dec9f7db987047e960db40e55"} Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.145729 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.145979 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.146219 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.151216 4962 generic.go:334] "Generic (PLEG): container finished" podID="b178e206-e661-4a20-af64-1b538fbc947a" 
containerID="50676283b1f5f2c528010bf413d0d8627e9afde30700c96062d318ba343f0cbe" exitCode=0 Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.151256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b178e206-e661-4a20-af64-1b538fbc947a","Type":"ContainerDied","Data":"50676283b1f5f2c528010bf413d0d8627e9afde30700c96062d318ba343f0cbe"} Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.151774 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.152371 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.152560 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:38 crc kubenswrapper[4962]: I1201 21:37:38.152709 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.004690 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.005397 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.005804 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.006047 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.006250 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.006282 4962 controller.go:115] "failed to 
update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.006609 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.165656 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bphp" event={"ID":"6ee35195-33b7-4bc8-80fb-7eb9f0ca221f","Type":"ContainerStarted","Data":"fee3a62a30276c67e03660d0752a56fcc5b2c086516826580ad4762b5106fc43"} Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.169231 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.171140 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.171628 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.171985 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.172332 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.174924 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.207525 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.519397 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.520492 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.520919 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.521302 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.521548 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.521786 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: E1201 21:37:39.609333 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.645091 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-kubelet-dir\") pod \"b178e206-e661-4a20-af64-1b538fbc947a\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.645122 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-var-lock\") pod \"b178e206-e661-4a20-af64-1b538fbc947a\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.645159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b178e206-e661-4a20-af64-1b538fbc947a-kube-api-access\") pod \"b178e206-e661-4a20-af64-1b538fbc947a\" (UID: \"b178e206-e661-4a20-af64-1b538fbc947a\") " Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.645881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b178e206-e661-4a20-af64-1b538fbc947a" (UID: "b178e206-e661-4a20-af64-1b538fbc947a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.645960 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-var-lock" (OuterVolumeSpecName: "var-lock") pod "b178e206-e661-4a20-af64-1b538fbc947a" (UID: "b178e206-e661-4a20-af64-1b538fbc947a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.651651 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b178e206-e661-4a20-af64-1b538fbc947a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b178e206-e661-4a20-af64-1b538fbc947a" (UID: "b178e206-e661-4a20-af64-1b538fbc947a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.658199 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.659396 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.659845 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.660140 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.660508 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.660681 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.660824 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection 
refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.660992 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746335 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746482 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746749 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746774 4962 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b178e206-e661-4a20-af64-1b538fbc947a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746787 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b178e206-e661-4a20-af64-1b538fbc947a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746874 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.746894 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.848235 4962 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.848269 4962 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:39 crc kubenswrapper[4962]: I1201 21:37:39.848277 4962 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.185752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b178e206-e661-4a20-af64-1b538fbc947a","Type":"ContainerDied","Data":"3efe046b06727fb9057fe9ce6463ca986605bab026b349a6fa1ef3051cc47b30"} Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.185775 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.185792 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3efe046b06727fb9057fe9ce6463ca986605bab026b349a6fa1ef3051cc47b30" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.189847 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.191518 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6" exitCode=0 Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.192559 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.196037 4962 scope.go:117] "RemoveContainer" containerID="ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.200125 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.200349 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.200559 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.200787 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.201056 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.201346 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.213222 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.213566 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.214043 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" 
pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.214267 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.214493 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.214759 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.227634 4962 scope.go:117] "RemoveContainer" containerID="82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.228071 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.242843 4962 scope.go:117] "RemoveContainer" containerID="2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.271307 4962 scope.go:117] "RemoveContainer" containerID="d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.291890 4962 scope.go:117] "RemoveContainer" containerID="edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.311068 4962 scope.go:117] "RemoveContainer" containerID="9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.337088 4962 scope.go:117] "RemoveContainer" containerID="ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.337565 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\": container with ID starting with ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee not found: ID does not exist" containerID="ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.337596 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee"} err="failed to get container status \"ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\": rpc error: code = 
NotFound desc = could not find container \"ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee\": container with ID starting with ff7bb726576fbf1e8a595c9fdab30ed4852d838f3ad72aa432b880e8466493ee not found: ID does not exist" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.337799 4962 scope.go:117] "RemoveContainer" containerID="82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.338104 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\": container with ID starting with 82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874 not found: ID does not exist" containerID="82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.338122 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874"} err="failed to get container status \"82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\": rpc error: code = NotFound desc = could not find container \"82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874\": container with ID starting with 82bbe0097117ffbc7aa9c8247995dcec3765f03a9aa7657ee0c0da4c3a3ec874 not found: ID does not exist" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.338135 4962 scope.go:117] "RemoveContainer" containerID="2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.338765 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\": container with ID starting with 2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff not found: ID does not exist" containerID="2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.338800 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff"} err="failed to get container status \"2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\": rpc error: code = NotFound desc = could not find container \"2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff\": container with ID starting with 2d764e05de4e1d86add3da927e55f8d2f211bba19a2104af89e43d13f986f5ff not found: ID does not exist" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.338819 4962 scope.go:117] "RemoveContainer" containerID="d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.339223 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\": container with ID starting with d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8 not found: ID does not exist" containerID="d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.339245 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8"} err="failed to get container status \"d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\": rpc error: code = NotFound desc = could not find container \"d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8\": container with ID starting with d2a26275d40e0cdc563a8704d985b92ca116fd053ac663b23ada1f3d55d3d3a8 not found: ID does not exist" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.339261 4962 scope.go:117] "RemoveContainer" containerID="edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.339764 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\": container with ID starting with edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6 not found: ID does not exist" containerID="edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.339792 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6"} err="failed to get container status \"edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\": rpc error: code = NotFound desc = could not find container \"edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6\": container with ID starting with edc674b0b2e671ee34618f4a89fb513b4d6be3b5b2c65510554ed57b84f305b6 not found: ID does not exist" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.339810 4962 scope.go:117] "RemoveContainer" containerID="9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.340076 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\": container with ID starting with 9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818 not found: ID does not exist" containerID="9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.340108 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818"} err="failed to get container status \"9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\": rpc error: code = NotFound desc = could not find container \"9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818\": container with ID starting with 9bd7a7ff350f5e3583a03b0dd72ebc58aad114c0789b7c886f0cfbd3383c6818 not found: ID does not exist" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.410199 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.448083 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.448570 4962 
status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.448918 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.449220 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.449468 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.449749 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: I1201 21:37:40.450132 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:40 crc kubenswrapper[4962]: E1201 21:37:40.505508 4962 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" volumeName="registry-storage" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.907506 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.910468 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.951127 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.951840 4962 status_manager.go:851] "Failed to get status for pod" 
podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.952154 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.952390 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.952607 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.952833 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:41 crc kubenswrapper[4962]: I1201 21:37:41.953056 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:42 crc kubenswrapper[4962]: E1201 21:37:42.011064 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="3.2s" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.103205 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.104638 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.282627 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t6qvm" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.283195 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: 
connection refused" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.283669 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.283909 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.284353 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.284725 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:42 crc kubenswrapper[4962]: I1201 21:37:42.285092 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:43 crc kubenswrapper[4962]: I1201 21:37:43.157196 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cckxw" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" containerName="registry-server" probeResult="failure" output=< Dec 01 21:37:43 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 21:37:43 crc kubenswrapper[4962]: > Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.339830 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.340951 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.386199 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.386885 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.387359 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" 
pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.388030 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.388315 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.388587 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.388827 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.518422 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.518504 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.578359 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.579270 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.580135 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.581024 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: 
connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.581563 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.582190 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:44 crc kubenswrapper[4962]: I1201 21:37:44.582707 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: E1201 21:37:45.212900 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="6.4s" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.275474 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4sv58" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.276464 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.277161 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.277822 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.278412 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.278900 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" 
pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.279337 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.282041 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6bphp" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.282709 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.283188 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.283566 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.283973 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.284371 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:45 crc kubenswrapper[4962]: I1201 21:37:45.284712 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:46 crc kubenswrapper[4962]: E1201 21:37:46.060538 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-cckxw.187d3523f4cdb9c5 
openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-cckxw,UID:d6ce364a-a2f9-48fa-9c65-5f8e65da569f,APIVersion:v1,ResourceVersion:29429,FieldPath:spec.containers{registry-server},},Reason:Created,Message:Created container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 21:37:37.546353093 +0000 UTC m=+241.647792298,LastTimestamp:2025-12-01 21:37:37.546353093 +0000 UTC m=+241.647792298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 21:37:46 crc kubenswrapper[4962]: I1201 21:37:46.223079 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:46 crc kubenswrapper[4962]: I1201 21:37:46.223493 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:46 crc kubenswrapper[4962]: I1201 21:37:46.224109 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:46 crc kubenswrapper[4962]: I1201 21:37:46.224359 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:46 crc kubenswrapper[4962]: I1201 21:37:46.224660 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:46 crc kubenswrapper[4962]: I1201 21:37:46.225011 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:51 crc kubenswrapper[4962]: E1201 21:37:51.614388 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="7s" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.173373 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.174091 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.174711 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.175108 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.175485 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.175816 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.176166 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.219657 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.220654 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.221070 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.221537 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.221888 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.222269 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.222497 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.230240 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cckxw" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.230682 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.231179 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.231499 4962 status_manager.go:851] 
"Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.232193 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.232511 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.232894 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.236064 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.236086 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:37:52 crc kubenswrapper[4962]: E1201 21:37:52.236449 4962 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.236978 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.269032 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.269090 4962 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee" exitCode=1 Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.269343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee"} Dec 01 21:37:52 crc kubenswrapper[4962]: W1201 21:37:52.269776 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-72d2ff77cb691cb47d55dbe9c184b4ab088c9385484034dc2960240b7ced3074 WatchSource:0}: Error finding container 72d2ff77cb691cb47d55dbe9c184b4ab088c9385484034dc2960240b7ced3074: Status 404 returned error can't find the container with id 72d2ff77cb691cb47d55dbe9c184b4ab088c9385484034dc2960240b7ced3074 Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.269889 4962 scope.go:117] "RemoveContainer" containerID="b00d16ddb5e5c38227a87ee09c7e0d74ca43fb9226904dc5d604abe6a9c9b5ee" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.270388 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.270789 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.271422 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.271923 4962 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.272496 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.272928 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: I1201 21:37:52.273524 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: E1201 21:37:52.917344 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:37:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:37:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:37:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T21:37:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: E1201 21:37:52.917954 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: E1201 21:37:52.918387 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: E1201 21:37:52.918592 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: E1201 21:37:52.918856 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:52 crc kubenswrapper[4962]: E1201 21:37:52.918881 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node 
status exceeds retry count" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.277606 4962 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b9a3771ffb101e270ee13ef239655fc866dff36590ad818fc787746e0c77d9b4" exitCode=0 Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.277708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b9a3771ffb101e270ee13ef239655fc866dff36590ad818fc787746e0c77d9b4"} Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.277777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72d2ff77cb691cb47d55dbe9c184b4ab088c9385484034dc2960240b7ced3074"} Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.278233 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.278256 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:37:53 crc kubenswrapper[4962]: E1201 21:37:53.278754 4962 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.278768 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.279878 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.280608 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.281015 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.281347 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.281717 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.282095 4962 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.282204 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.282243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7e17fd59ed1228c45e15c2d62dba5f098d3995e83941c78693e037d973b27106"} Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.282999 4962 status_manager.go:851] "Failed to get status for pod" podUID="b178e206-e661-4a20-af64-1b538fbc947a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.283407 4962 status_manager.go:851] "Failed to get status for pod" podUID="3ea88a22-b18e-4b46-812c-35cb8dcdeb30" pod="openshift-marketplace/certified-operators-4sv58" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-4sv58\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.283714 4962 status_manager.go:851] "Failed to get status for pod" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" pod="openshift-marketplace/redhat-marketplace-6bphp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6bphp\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.284125 4962 status_manager.go:851] "Failed to get status for pod" podUID="0cfa40bb-018f-4c4f-afbf-cfab90e33210" pod="openshift-image-registry/image-registry-66df7c8f76-74bhz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-74bhz\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.284510 4962 status_manager.go:851] "Failed to get status for pod" podUID="21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a" pod="openshift-marketplace/community-operators-t6qvm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-t6qvm\": dial tcp 38.102.83.110:6443: connect: connection 
refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.284922 4962 status_manager.go:851] "Failed to get status for pod" podUID="d6ce364a-a2f9-48fa-9c65-5f8e65da569f" pod="openshift-marketplace/redhat-operators-cckxw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cckxw\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.285293 4962 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.793675 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.797511 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" podUID="6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" containerName="oauth-openshift" containerID="cri-o://72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8" gracePeriod=15 Dec 01 21:37:53 crc kubenswrapper[4962]: I1201 21:37:53.797819 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.258427 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.289669 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4e154076db2d4f4f53da84a6f1ade439582d428396507f40d103ce26a37c2ed8"} Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.289724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"451ff4a04910ffaa0eecb9816dfcb2e1ed764f0ee48ed8bc1a2f0eb2c4ad3fad"} Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.289737 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"88582ea3120d715a99bd9d0962f8fe291a152e940e366ce7bcfdfd47a8a137b0"} Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.294121 4962 generic.go:334] "Generic (PLEG): container finished" podID="6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" containerID="72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8" exitCode=0 Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.294205 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.294243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" event={"ID":"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e","Type":"ContainerDied","Data":"72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8"} Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.294265 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-csp6p" event={"ID":"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e","Type":"ContainerDied","Data":"9f060d82be232e97b5ecfd17fd7925a7feae037e2fac54fcc0f681d1afee65a7"} Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.294288 4962 scope.go:117] "RemoveContainer" containerID="72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.294410 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.311202 4962 scope.go:117] "RemoveContainer" containerID="72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8" Dec 01 21:37:54 crc kubenswrapper[4962]: E1201 21:37:54.311647 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8\": container with ID starting with 72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8 not found: ID does not exist" containerID="72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.311685 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8"} err="failed to get container status \"72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8\": rpc error: code = NotFound desc = could not find container \"72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8\": container with ID starting with 72d5cfdbcc979b4604c202a89d3d32785ea309eccf5e2e989cb10fac49b689e8 not found: ID does not exist" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-router-certs\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383056 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-policies\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383123 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-cliconfig\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383149 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-trusted-ca-bundle\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383175 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-idp-0-file-data\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383201 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-serving-cert\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383239 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-login\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383265 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-provider-selection\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383297 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qw78\" (UniqueName: \"kubernetes.io/projected/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-kube-api-access-9qw78\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383324 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-ocp-branding-template\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-service-ca\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383388 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-error\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383429 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-session\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-dir\") pod \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\" (UID: \"6e0f577b-3526-4f2e-846f-65d5a5ee1d8e\") " Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.383728 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.385788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.385799 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.386060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.388711 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.400311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.400588 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-kube-api-access-9qw78" (OuterVolumeSpecName: "kube-api-access-9qw78") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "kube-api-access-9qw78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.401010 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.404523 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.405204 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.405519 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.405664 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.405707 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.406174 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" (UID: "6e0f577b-3526-4f2e-846f-65d5a5ee1d8e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485021 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485058 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485068 4962 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485077 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485085 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485094 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485104 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485115 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485131 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485140 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc 
kubenswrapper[4962]: I1201 21:37:54.485150 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485159 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qw78\" (UniqueName: \"kubernetes.io/projected/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-kube-api-access-9qw78\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485168 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:54 crc kubenswrapper[4962]: I1201 21:37:54.485179 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:37:55 crc kubenswrapper[4962]: I1201 21:37:55.302615 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:37:55 crc kubenswrapper[4962]: I1201 21:37:55.302651 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:37:55 crc kubenswrapper[4962]: I1201 21:37:55.302883 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df7906d839d4aa1ab0bd86b03ed7129b336760d2ba9dfb2b644059b70099e1d6"} Dec 01 21:37:55 crc kubenswrapper[4962]: I1201 21:37:55.302912 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:55 crc kubenswrapper[4962]: I1201 21:37:55.302923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6eecc3148815b1a13921c31123839413c7586b6ad37af956d78ea4b7ebf24959"} Dec 01 21:37:57 crc kubenswrapper[4962]: I1201 21:37:57.238155 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:57 crc kubenswrapper[4962]: I1201 21:37:57.238516 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:37:57 crc kubenswrapper[4962]: I1201 21:37:57.246173 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:38:00 crc kubenswrapper[4962]: I1201 21:38:00.360560 4962 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:38:00 crc kubenswrapper[4962]: I1201 21:38:00.562882 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1e325d7a-4305-4e6f-8df3-e4e963a1553d" Dec 01 21:38:01 crc kubenswrapper[4962]: I1201 21:38:01.355446 4962 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:38:01 crc kubenswrapper[4962]: I1201 21:38:01.355494 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:38:01 crc kubenswrapper[4962]: I1201 21:38:01.358670 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1e325d7a-4305-4e6f-8df3-e4e963a1553d" Dec 01 21:38:06 crc kubenswrapper[4962]: I1201 21:38:06.239445 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 21:38:09 crc kubenswrapper[4962]: I1201 21:38:09.974107 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 21:38:10 crc kubenswrapper[4962]: I1201 21:38:10.319623 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 21:38:10 crc kubenswrapper[4962]: I1201 21:38:10.843595 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 21:38:10 crc kubenswrapper[4962]: I1201 21:38:10.972885 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 21:38:11 crc kubenswrapper[4962]: I1201 21:38:11.086684 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 21:38:11 crc kubenswrapper[4962]: I1201 21:38:11.738209 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 21:38:11 crc kubenswrapper[4962]: I1201 21:38:11.813242 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 21:38:11 crc kubenswrapper[4962]: I1201 21:38:11.892441 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 21:38:11 crc kubenswrapper[4962]: I1201 21:38:11.921529 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 21:38:11 crc kubenswrapper[4962]: I1201 21:38:11.947485 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 21:38:11 crc kubenswrapper[4962]: I1201 21:38:11.970121 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 21:38:12 crc kubenswrapper[4962]: I1201 21:38:12.047665 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 21:38:12 crc kubenswrapper[4962]: I1201 21:38:12.237466 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 21:38:12 crc kubenswrapper[4962]: I1201 21:38:12.272207 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 21:38:12 crc kubenswrapper[4962]: I1201 21:38:12.298657 4962 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 21:38:12 crc kubenswrapper[4962]: I1201 21:38:12.388437 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 21:38:12 crc kubenswrapper[4962]: I1201 21:38:12.485863 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 21:38:12 crc kubenswrapper[4962]: I1201 21:38:12.523308 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.075245 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.136276 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.178312 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.210559 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.382739 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.411749 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.443674 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.517182 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.531710 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.538188 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.587228 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.587308 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.621335 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.749631 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.817761 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.825045 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.881649 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 21:38:13 crc kubenswrapper[4962]: I1201 21:38:13.923413 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.029880 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.085540 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.161911 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.190178 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.290343 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.321878 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.438810 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.450796 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.558419 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.576019 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.618169 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.626929 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.719419 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 21:38:14 crc kubenswrapper[4962]: I1201 21:38:14.903559 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.049362 4962 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.165352 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 21:38:15 crc 
kubenswrapper[4962]: I1201 21:38:15.187538 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.296019 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.307173 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.460637 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.471057 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.487439 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.490017 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.778209 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.796250 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 21:38:15 crc kubenswrapper[4962]: I1201 21:38:15.959074 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.048701 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.076227 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.223081 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.229374 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.233665 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.273432 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.276279 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.302365 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.320095 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" 
Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.343045 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.415923 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.417892 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.447167 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.452075 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.643681 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.682262 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.704757 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.711431 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.711906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.723736 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.743600 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.760892 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.771686 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.793919 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.902662 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.907179 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.915103 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.958132 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.987553 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 21:38:16 crc kubenswrapper[4962]: I1201 21:38:16.989060 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.035708 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.068217 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.084812 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.095511 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.130671 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.141255 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.161183 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.170282 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.205032 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.255177 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.265415 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.271449 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.367658 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.422872 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.432868 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.527211 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.527238 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 21:38:17 
crc kubenswrapper[4962]: I1201 21:38:17.532096 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.543842 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.544194 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.556057 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.563202 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.639063 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.672482 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.701593 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.737769 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.814912 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.853172 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 21:38:17 crc kubenswrapper[4962]: I1201 21:38:17.918603 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.141853 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.184219 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.198735 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.308733 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.320317 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.340456 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.366555 4962 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.422053 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.463340 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.575165 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.664820 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.669063 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.776012 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.776097 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.796650 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.798649 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.814702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.824720 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.876599 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.902598 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 21:38:18 crc kubenswrapper[4962]: I1201 21:38:18.984221 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.057400 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.087441 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.104873 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.106272 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.139909 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.171366 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.176578 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.219229 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.348807 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.405295 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.498061 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.505119 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.530849 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.671668 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.859799 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.952898 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.969667 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 21:38:19 crc kubenswrapper[4962]: I1201 21:38:19.994920 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 21:38:20 crc kubenswrapper[4962]: I1201 21:38:20.130637 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 21:38:20 crc kubenswrapper[4962]: I1201 21:38:20.194605 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 21:38:20 crc kubenswrapper[4962]: I1201 21:38:20.232080 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 21:38:20 crc kubenswrapper[4962]: I1201 21:38:20.461121 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 21:38:20 crc kubenswrapper[4962]: I1201 21:38:20.496467 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 21:38:20 crc kubenswrapper[4962]: I1201 21:38:20.671026 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 21:38:20 crc 
kubenswrapper[4962]: I1201 21:38:20.788512 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 21:38:20 crc kubenswrapper[4962]: I1201 21:38:20.796544 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.398270 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.419130 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.431092 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.444920 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.459149 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.570387 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.791475 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.854028 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 21:38:21 crc kubenswrapper[4962]: I1201 21:38:21.907672 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.004920 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.041395 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.047837 4962 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.049301 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t6qvm" podStartSLOduration=46.83844763 podStartE2EDuration="51.049282663s" podCreationTimestamp="2025-12-01 21:37:31 +0000 UTC" firstStartedPulling="2025-12-01 21:37:33.101311562 +0000 UTC m=+237.202750767" lastFinishedPulling="2025-12-01 21:37:37.312146585 +0000 UTC m=+241.413585800" observedRunningTime="2025-12-01 21:38:00.457519959 +0000 UTC m=+264.558959164" watchObservedRunningTime="2025-12-01 21:38:22.049282663 +0000 UTC m=+286.150721858" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.051851 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6bphp" podStartSLOduration=45.182009936 podStartE2EDuration="48.051844822s" podCreationTimestamp="2025-12-01 21:37:34 +0000 UTC" firstStartedPulling="2025-12-01 21:37:35.193782584 +0000 UTC m=+239.295221779" 
lastFinishedPulling="2025-12-01 21:37:38.06361747 +0000 UTC m=+242.165056665" observedRunningTime="2025-12-01 21:38:00.408540384 +0000 UTC m=+264.509979599" watchObservedRunningTime="2025-12-01 21:38:22.051844822 +0000 UTC m=+286.153284017" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.052081 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4sv58" podStartSLOduration=46.286116904 podStartE2EDuration="49.052074969s" podCreationTimestamp="2025-12-01 21:37:33 +0000 UTC" firstStartedPulling="2025-12-01 21:37:35.181790119 +0000 UTC m=+239.283229324" lastFinishedPulling="2025-12-01 21:37:37.947748194 +0000 UTC m=+242.049187389" observedRunningTime="2025-12-01 21:38:00.382081088 +0000 UTC m=+264.483520353" watchObservedRunningTime="2025-12-01 21:38:22.052074969 +0000 UTC m=+286.153514164" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.052486 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cckxw" podStartSLOduration=46.918754353 podStartE2EDuration="51.05248028s" podCreationTimestamp="2025-12-01 21:37:31 +0000 UTC" firstStartedPulling="2025-12-01 21:37:33.084922048 +0000 UTC m=+237.186361253" lastFinishedPulling="2025-12-01 21:37:37.218647975 +0000 UTC m=+241.320087180" observedRunningTime="2025-12-01 21:38:00.483835671 +0000 UTC m=+264.585274926" watchObservedRunningTime="2025-12-01 21:38:22.05248028 +0000 UTC m=+286.153919475" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.052967 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-csp6p","openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053021 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp"] Dec 01 21:38:22 crc kubenswrapper[4962]: E1201 21:38:22.053199 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" containerName="oauth-openshift" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053215 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" containerName="oauth-openshift" Dec 01 21:38:22 crc kubenswrapper[4962]: E1201 21:38:22.053232 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b178e206-e661-4a20-af64-1b538fbc947a" containerName="installer" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053238 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b178e206-e661-4a20-af64-1b538fbc947a" containerName="installer" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053322 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" containerName="oauth-openshift" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053332 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b178e206-e661-4a20-af64-1b538fbc947a" containerName="installer" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053653 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k8fq2"] Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053887 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:38:22 crc 
Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053987 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ac15c86b-cd18-4110-bda1-ba116dccb445" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.053960 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.056305 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.061469 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.061557 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.061596 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.061711 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.061768 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.061769 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.061912 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.062296 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.062407 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.062504 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.062607 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.065824 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.065969 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.067777 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.074524 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.075264 4962 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.083168 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.0831529 podStartE2EDuration="22.0831529s" podCreationTimestamp="2025-12-01 21:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:38:22.082671797 +0000 UTC m=+286.184111012" watchObservedRunningTime="2025-12-01 21:38:22.0831529 +0000 UTC m=+286.184592095" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.091027 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.137668 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.149885 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-error\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.149957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.149982 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150005 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-login\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150027 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-session\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150063 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxg2v\" (UniqueName: 
\"kubernetes.io/projected/119ef561-9041-4041-9753-ec6650966caa-kube-api-access-gxg2v\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150104 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/119ef561-9041-4041-9753-ec6650966caa-audit-dir\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150131 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-audit-policies\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150155 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150234 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.150260 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.185645 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.228885 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0f577b-3526-4f2e-846f-65d5a5ee1d8e" path="/var/lib/kubelet/pods/6e0f577b-3526-4f2e-846f-65d5a5ee1d8e/volumes" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.251310 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.251623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.251722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.251811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.251970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.252126 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-error\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc 
Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.252296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.252452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.252587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-login\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.252702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-session\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.252840 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxg2v\" (UniqueName: \"kubernetes.io/projected/119ef561-9041-4041-9753-ec6650966caa-kube-api-access-gxg2v\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.253029 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.253160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/119ef561-9041-4041-9753-ec6650966caa-audit-dir\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.253279 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-audit-policies\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.253715 4962 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.253811 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/119ef561-9041-4041-9753-ec6650966caa-audit-dir\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.254129 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.254383 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-audit-policies\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.255052 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.259058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.259159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-session\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.259482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-error\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.259981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-login\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.260149 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.261180 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.261326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.263286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/119ef561-9041-4041-9753-ec6650966caa-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.269064 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxg2v\" (UniqueName: \"kubernetes.io/projected/119ef561-9041-4041-9753-ec6650966caa-kube-api-access-gxg2v\") pod \"oauth-openshift-7c7b56dd96-qsrlp\" (UID: \"119ef561-9041-4041-9753-ec6650966caa\") " pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.317159 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.370853 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.381384 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.391565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.401157 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.438275 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.599225 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.688572 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.690211 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp"] Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.867728 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.882878 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 21:38:22 crc kubenswrapper[4962]: I1201 21:38:22.950238 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.108158 4962 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.108517 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c84559d0dc527ca39a746d3e1b24b6706ea1243d121135a8f297e6c2fa0a69ed" gracePeriod=5 Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.171898 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.199000 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.255567 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.365281 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.473219 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.492286 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.507462 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" event={"ID":"119ef561-9041-4041-9753-ec6650966caa","Type":"ContainerStarted","Data":"b40735ba9a4173ced275aca67ac6f80b9f328447dd83574c9688afdd7c7e7ec8"} Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.507536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" event={"ID":"119ef561-9041-4041-9753-ec6650966caa","Type":"ContainerStarted","Data":"f4d4974ac06293bd00f1550696c60fe0bf2734ce7314ffe60cc0e18b97854d6a"} Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.508145 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.517914 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.529236 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.547815 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.575323 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7c7b56dd96-qsrlp" podStartSLOduration=55.575299256 podStartE2EDuration="55.575299256s" podCreationTimestamp="2025-12-01 21:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:38:23.54623906 +0000 UTC m=+287.647678295" watchObservedRunningTime="2025-12-01 21:38:23.575299256 +0000 UTC m=+287.676738491" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.827960 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.830100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.919061 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.921883 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 21:38:23 crc kubenswrapper[4962]: I1201 21:38:23.970200 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.032679 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.044739 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.161511 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.533590 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.670666 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.815796 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.819524 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.962356 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 21:38:24 crc kubenswrapper[4962]: I1201 21:38:24.970691 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.030594 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.119571 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.272638 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.289779 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.318755 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.340207 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.644350 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 21:38:25 crc kubenswrapper[4962]: I1201 21:38:25.872211 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 21:38:26 crc kubenswrapper[4962]: I1201 21:38:26.194750 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 21:38:26 crc kubenswrapper[4962]: I1201 21:38:26.363537 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 21:38:26 crc kubenswrapper[4962]: I1201 21:38:26.556172 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.540491 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.540553 4962 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerID="c84559d0dc527ca39a746d3e1b24b6706ea1243d121135a8f297e6c2fa0a69ed" exitCode=137 Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.723787 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.723913 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844243 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844458 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844517 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844588 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844605 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844656 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844750 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.844900 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.845041 4962 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.845065 4962 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.845059 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.856972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.946200 4962 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.946255 4962 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:28 crc kubenswrapper[4962]: I1201 21:38:28.946273 4962 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:29 crc kubenswrapper[4962]: I1201 21:38:29.550687 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 21:38:29 crc kubenswrapper[4962]: I1201 21:38:29.550816 4962 scope.go:117] "RemoveContainer" containerID="c84559d0dc527ca39a746d3e1b24b6706ea1243d121135a8f297e6c2fa0a69ed" Dec 01 21:38:29 crc kubenswrapper[4962]: I1201 21:38:29.550930 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 21:38:30 crc kubenswrapper[4962]: I1201 21:38:30.230852 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 21:38:36 crc kubenswrapper[4962]: I1201 21:38:36.120778 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 21:38:37 crc kubenswrapper[4962]: I1201 21:38:37.388970 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.956786 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6"] Dec 01 21:38:39 crc kubenswrapper[4962]: E1201 21:38:39.957142 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.957164 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.957343 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.957960 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.961998 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.962105 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.964488 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.965142 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.965221 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 01 21:38:39 crc kubenswrapper[4962]: I1201 21:38:39.976416 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6"] Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.021697 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/94a0af0d-9396-4d73-87b6-4ebcafde97af-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.021789 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ds8\" (UniqueName: 
\"kubernetes.io/projected/94a0af0d-9396-4d73-87b6-4ebcafde97af-kube-api-access-w2ds8\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.021844 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/94a0af0d-9396-4d73-87b6-4ebcafde97af-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.123610 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/94a0af0d-9396-4d73-87b6-4ebcafde97af-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.123747 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ds8\" (UniqueName: \"kubernetes.io/projected/94a0af0d-9396-4d73-87b6-4ebcafde97af-kube-api-access-w2ds8\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.123806 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/94a0af0d-9396-4d73-87b6-4ebcafde97af-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.125028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/94a0af0d-9396-4d73-87b6-4ebcafde97af-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.134478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/94a0af0d-9396-4d73-87b6-4ebcafde97af-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.158744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ds8\" (UniqueName: \"kubernetes.io/projected/94a0af0d-9396-4d73-87b6-4ebcafde97af-kube-api-access-w2ds8\") pod \"cluster-monitoring-operator-6d5b84845-vr7k6\" (UID: \"94a0af0d-9396-4d73-87b6-4ebcafde97af\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.287627 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.596235 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6"] Dec 01 21:38:40 crc kubenswrapper[4962]: I1201 21:38:40.635554 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" event={"ID":"94a0af0d-9396-4d73-87b6-4ebcafde97af","Type":"ContainerStarted","Data":"b020e077685979a60571ffcbf718354d7dc90aad5c1feb2bfde6d315e2d92380"} Dec 01 21:38:41 crc kubenswrapper[4962]: I1201 21:38:41.747998 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.474523 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb"] Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.476488 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.481819 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.494558 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb"] Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.588155 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.657507 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.658197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" event={"ID":"94a0af0d-9396-4d73-87b6-4ebcafde97af","Type":"ContainerStarted","Data":"2b470f602070c480005e20ab2e6e12c70c618a3f357e660867df11a9a81a5057"} Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.680823 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-vr7k6" podStartSLOduration=2.458558479 podStartE2EDuration="4.680798452s" podCreationTimestamp="2025-12-01 21:38:39 +0000 UTC" firstStartedPulling="2025-12-01 21:38:40.604265922 +0000 UTC m=+304.705705137" lastFinishedPulling="2025-12-01 21:38:42.826505885 +0000 UTC m=+306.927945110" observedRunningTime="2025-12-01 21:38:43.67958638 +0000 UTC m=+307.781025645" watchObservedRunningTime="2025-12-01 21:38:43.680798452 +0000 UTC m=+307.782237687" Dec 01 21:38:43 crc kubenswrapper[4962]: I1201 21:38:43.691825 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:43 crc kubenswrapper[4962]: E1201 21:38:43.692167 4962 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:43 crc kubenswrapper[4962]: E1201 21:38:43.692280 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates podName:f02d5ee7-1f55-4474-94c6-005cdc9974bf nodeName:}" failed. No retries permitted until 2025-12-01 21:38:44.192253622 +0000 UTC m=+308.293692857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-dfszb" (UID: "f02d5ee7-1f55-4474-94c6-005cdc9974bf") : secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:44 crc kubenswrapper[4962]: I1201 21:38:44.200494 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:44 crc kubenswrapper[4962]: E1201 21:38:44.200811 4962 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:44 crc kubenswrapper[4962]: E1201 21:38:44.200977 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates podName:f02d5ee7-1f55-4474-94c6-005cdc9974bf nodeName:}" failed. No retries permitted until 2025-12-01 21:38:45.200916996 +0000 UTC m=+309.302356221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-dfszb" (UID: "f02d5ee7-1f55-4474-94c6-005cdc9974bf") : secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:45 crc kubenswrapper[4962]: I1201 21:38:45.216207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:45 crc kubenswrapper[4962]: E1201 21:38:45.216483 4962 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:45 crc kubenswrapper[4962]: E1201 21:38:45.216959 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates podName:f02d5ee7-1f55-4474-94c6-005cdc9974bf nodeName:}" failed. No retries permitted until 2025-12-01 21:38:47.216901638 +0000 UTC m=+311.318340873 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-dfszb" (UID: "f02d5ee7-1f55-4474-94c6-005cdc9974bf") : secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.105095 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" podUID="b2d93aa1-eee7-4a67-b5ee-a05a6696b624" containerName="registry" containerID="cri-o://e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4" gracePeriod=30 Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.245847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:47 crc kubenswrapper[4962]: E1201 21:38:47.246048 4962 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:47 crc kubenswrapper[4962]: E1201 21:38:47.246183 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates podName:f02d5ee7-1f55-4474-94c6-005cdc9974bf nodeName:}" failed. No retries permitted until 2025-12-01 21:38:51.246151949 +0000 UTC m=+315.347591174 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-dfszb" (UID: "f02d5ee7-1f55-4474-94c6-005cdc9974bf") : secret "prometheus-operator-admission-webhook-tls" not found
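
[Annotation] The repeated tls-certificates failures above show the mount retry backoff: durationBeforeRetry doubles across attempts (500ms, 1s, 2s, 4s), and each operation stays blocked ("No retries permitted until ...") in between; retries continue, with growing delays, until the prometheus-operator-admission-webhook-tls secret exists. A Go sketch of just the doubling schedule visible in the log; the cap and the bookkeeping inside kubelet's nestedpendingoperations are not reproduced here:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Matches the durationBeforeRetry progression above: 500ms, 1s, 2s, 4s.
    	delay := 500 * time.Millisecond
    	for attempt := 1; attempt <= 4; attempt++ {
    		fmt.Printf("attempt %d failed: no retries permitted for %v\n", attempt, delay)
    		delay *= 2
    	}
    }
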
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.650827 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-tls\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.650961 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-bound-sa-token\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.651008 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-ca-trust-extracted\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.651052 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-certificates\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.651251 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.651296 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh5vl\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-kube-api-access-rh5vl\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.651332 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-installation-pull-secrets\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.651366 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-trusted-ca\") pod \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\" (UID: \"b2d93aa1-eee7-4a67-b5ee-a05a6696b624\") " Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.652986 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.653031 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.658836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.659251 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-kube-api-access-rh5vl" (OuterVolumeSpecName: "kube-api-access-rh5vl") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "kube-api-access-rh5vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.663095 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.664172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.674044 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.684370 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b2d93aa1-eee7-4a67-b5ee-a05a6696b624" (UID: "b2d93aa1-eee7-4a67-b5ee-a05a6696b624"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.689078 4962 generic.go:334] "Generic (PLEG): container finished" podID="b2d93aa1-eee7-4a67-b5ee-a05a6696b624" containerID="e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4" exitCode=0 Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.689151 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" event={"ID":"b2d93aa1-eee7-4a67-b5ee-a05a6696b624","Type":"ContainerDied","Data":"e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4"} Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.689214 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" event={"ID":"b2d93aa1-eee7-4a67-b5ee-a05a6696b624","Type":"ContainerDied","Data":"f709c5f5721e83dc178563bba08680d253863887dc0d8ea1ea435658deeb5e0c"} Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.689230 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k8fq2" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.689256 4962 scope.go:117] "RemoveContainer" containerID="e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.739218 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k8fq2"] Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.743529 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k8fq2"] Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.748623 4962 scope.go:117] "RemoveContainer" containerID="e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4" Dec 01 21:38:47 crc kubenswrapper[4962]: E1201 21:38:47.750246 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4\": container with ID starting with e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4 not found: ID does not exist" containerID="e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.750300 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4"} err="failed to get container status \"e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4\": rpc error: code = NotFound desc = could not find container \"e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4\": container with ID starting with e8b6432ff0785108e95c073da1d34c555317244be9bf66ae7e3e272a3b632af4 not found: ID does not exist" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.754659 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.754705 4962 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 
21:38:47.754728 4962 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.754749 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh5vl\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-kube-api-access-rh5vl\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.754768 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.754787 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:47 crc kubenswrapper[4962]: I1201 21:38:47.754805 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b2d93aa1-eee7-4a67-b5ee-a05a6696b624-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:38:48 crc kubenswrapper[4962]: I1201 21:38:48.232922 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d93aa1-eee7-4a67-b5ee-a05a6696b624" path="/var/lib/kubelet/pods/b2d93aa1-eee7-4a67-b5ee-a05a6696b624/volumes" Dec 01 21:38:51 crc kubenswrapper[4962]: I1201 21:38:51.334892 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:51 crc kubenswrapper[4962]: E1201 21:38:51.335149 4962 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:51 crc kubenswrapper[4962]: E1201 21:38:51.335508 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates podName:f02d5ee7-1f55-4474-94c6-005cdc9974bf nodeName:}" failed. No retries permitted until 2025-12-01 21:38:59.335471794 +0000 UTC m=+323.436911019 (durationBeforeRetry 8s). 
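Every one of the retries above fails for the same reason: the secret openshift-monitoring/prometheus-operator-admission-webhook-tls does not exist yet, so MountVolume.SetUp has nothing to project into the tls-certificates volume. The sketch below performs the equivalent existence check with client-go; it assumes in-cluster credentials and illustrates the lookup, rather than reproducing the kubelet's actual secret-volume code path.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	// Assumes the program runs inside the cluster; use clientcmd with a
	// kubeconfig when running it elsewhere.
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Conceptually the same lookup that secret.go reports failing above.
	_, err = clientset.CoreV1().Secrets("openshift-monitoring").
		Get(context.TODO(), "prometheus-operator-admission-webhook-tls", metav1.GetOptions{})
	if err != nil {
		fmt.Println("secret missing, mount will keep retrying:", err)
		return
	}
	fmt.Println("secret exists; the pending MountVolume.SetUp should succeed on a later retry")
}

Once the secret is created, the next scheduled retry picks it up and the pod can proceed; until then the backoff loop above simply keeps extending the deadline.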
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-dfszb" (UID: "f02d5ee7-1f55-4474-94c6-005cdc9974bf") : secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:52 crc kubenswrapper[4962]: I1201 21:38:52.671888 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 21:38:53 crc kubenswrapper[4962]: I1201 21:38:53.518214 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 21:38:54 crc kubenswrapper[4962]: I1201 21:38:54.636539 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 21:38:55 crc kubenswrapper[4962]: I1201 21:38:55.521468 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 21:38:59 crc kubenswrapper[4962]: I1201 21:38:59.351176 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:38:59 crc kubenswrapper[4962]: E1201 21:38:59.351420 4962 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:38:59 crc kubenswrapper[4962]: E1201 21:38:59.351705 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates podName:f02d5ee7-1f55-4474-94c6-005cdc9974bf nodeName:}" failed. No retries permitted until 2025-12-01 21:39:15.351687209 +0000 UTC m=+339.453126414 (durationBeforeRetry 16s). 
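The reflector.go:368 "Caches populated" lines interleaved above mark client-go reflectors completing their initial List/Watch, at which point the shared informer cache for the named object (a Secret or ConfigMap per line) is ready to serve reads. A minimal sketch of the same mechanism for Secrets follows; it assumes a kubeconfig at the default location, and the event handler body is purely illustrative.

package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes ~/.kube/config; swap in rest.InClusterConfig() inside a pod.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	factory := informers.NewSharedInformerFactory(clientset, 10*time.Minute)
	secretInformer := factory.Core().V1().Secrets().Informer()
	secretInformer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			s := obj.(*corev1.Secret)
			fmt.Printf("cache populated: %s/%s\n", s.Namespace, s.Name)
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	// Blocks until the reflector's initial List/Watch has filled the cache,
	// the point at which the kubelet prints "Caches populated".
	cache.WaitForCacheSync(stop, secretInformer.HasSynced)
	select {}
}

WaitForCacheSync returns once the initial listing is stored, which corresponds to the moment each "Caches populated for *v1.Secret from object-..." line appears in this log.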
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-dfszb" (UID: "f02d5ee7-1f55-4474-94c6-005cdc9974bf") : secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:39:02 crc kubenswrapper[4962]: I1201 21:39:02.279717 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.152462 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.429298 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5kqx"] Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.429823 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" podUID="7ad9fda2-065b-4620-bc36-e33403fcdd53" containerName="controller-manager" containerID="cri-o://47d64b0df8f4445f2fb91407f9f29c297bb37075610f907fbd8ab63be614b20d" gracePeriod=30 Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.527331 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"] Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.527629 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" podUID="458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" containerName="route-controller-manager" containerID="cri-o://4bd3682a0231b1cc286c33cd99673e2d6760c5dc19a8ed1933dd9dd2000d6e74" gracePeriod=30 Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.804746 4962 generic.go:334] "Generic (PLEG): container finished" podID="7ad9fda2-065b-4620-bc36-e33403fcdd53" containerID="47d64b0df8f4445f2fb91407f9f29c297bb37075610f907fbd8ab63be614b20d" exitCode=0 Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.804830 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" event={"ID":"7ad9fda2-065b-4620-bc36-e33403fcdd53","Type":"ContainerDied","Data":"47d64b0df8f4445f2fb91407f9f29c297bb37075610f907fbd8ab63be614b20d"} Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.806250 4962 generic.go:334] "Generic (PLEG): container finished" podID="458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" containerID="4bd3682a0231b1cc286c33cd99673e2d6760c5dc19a8ed1933dd9dd2000d6e74" exitCode=0 Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.806281 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" event={"ID":"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34","Type":"ContainerDied","Data":"4bd3682a0231b1cc286c33cd99673e2d6760c5dc19a8ed1933dd9dd2000d6e74"} Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.836886 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.890715 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947634 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles\") pod \"7ad9fda2-065b-4620-bc36-e33403fcdd53\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947702 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-client-ca\") pod \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947723 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-serving-cert\") pod \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca\") pod \"7ad9fda2-065b-4620-bc36-e33403fcdd53\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947784 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgkfc\" (UniqueName: \"kubernetes.io/projected/7ad9fda2-065b-4620-bc36-e33403fcdd53-kube-api-access-rgkfc\") pod \"7ad9fda2-065b-4620-bc36-e33403fcdd53\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert\") pod \"7ad9fda2-065b-4620-bc36-e33403fcdd53\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947861 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5frm\" (UniqueName: \"kubernetes.io/projected/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-kube-api-access-r5frm\") pod \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947884 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-config\") pod \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\" (UID: \"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.947943 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config\") pod \"7ad9fda2-065b-4620-bc36-e33403fcdd53\" (UID: \"7ad9fda2-065b-4620-bc36-e33403fcdd53\") " Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.949279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"7ad9fda2-065b-4620-bc36-e33403fcdd53" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.949390 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config" (OuterVolumeSpecName: "config") pod "7ad9fda2-065b-4620-bc36-e33403fcdd53" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.949737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ad9fda2-065b-4620-bc36-e33403fcdd53" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.949832 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-client-ca" (OuterVolumeSpecName: "client-ca") pod "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" (UID: "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.950322 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-config" (OuterVolumeSpecName: "config") pod "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" (UID: "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.954496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ad9fda2-065b-4620-bc36-e33403fcdd53" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.954502 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad9fda2-065b-4620-bc36-e33403fcdd53-kube-api-access-rgkfc" (OuterVolumeSpecName: "kube-api-access-rgkfc") pod "7ad9fda2-065b-4620-bc36-e33403fcdd53" (UID: "7ad9fda2-065b-4620-bc36-e33403fcdd53"). InnerVolumeSpecName "kube-api-access-rgkfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.954580 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" (UID: "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:39:05 crc kubenswrapper[4962]: I1201 21:39:05.955821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-kube-api-access-r5frm" (OuterVolumeSpecName: "kube-api-access-r5frm") pod "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" (UID: "458c25bb-0a4d-4f96-9f46-cdfd66b6ae34"). InnerVolumeSpecName "kube-api-access-r5frm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.049706 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050038 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050054 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050065 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050076 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgkfc\" (UniqueName: \"kubernetes.io/projected/7ad9fda2-065b-4620-bc36-e33403fcdd53-kube-api-access-rgkfc\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050086 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad9fda2-065b-4620-bc36-e33403fcdd53-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050094 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5frm\" (UniqueName: \"kubernetes.io/projected/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-kube-api-access-r5frm\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050105 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.050117 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad9fda2-065b-4620-bc36-e33403fcdd53-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.611468 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"] Dec 01 21:39:06 crc kubenswrapper[4962]: E1201 21:39:06.611733 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" containerName="route-controller-manager" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.611746 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" containerName="route-controller-manager" Dec 01 21:39:06 crc kubenswrapper[4962]: E1201 
21:39:06.611757 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d93aa1-eee7-4a67-b5ee-a05a6696b624" containerName="registry" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.611763 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d93aa1-eee7-4a67-b5ee-a05a6696b624" containerName="registry" Dec 01 21:39:06 crc kubenswrapper[4962]: E1201 21:39:06.611776 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9fda2-065b-4620-bc36-e33403fcdd53" containerName="controller-manager" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.611783 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9fda2-065b-4620-bc36-e33403fcdd53" containerName="controller-manager" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.611874 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad9fda2-065b-4620-bc36-e33403fcdd53" containerName="controller-manager" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.611881 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" containerName="route-controller-manager" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.611890 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d93aa1-eee7-4a67-b5ee-a05a6696b624" containerName="registry" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.612292 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.616306 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"] Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.618716 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.624431 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"] Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.631331 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"] Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.760686 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-config\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.760740 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-serving-cert\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.760772 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-proxy-ca-bundles\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.760809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-client-ca\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.760999 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-client-ca\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.761191 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9797aca-9578-449d-ab30-190815736ab7-serving-cert\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.763089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-config\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " 
pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.763173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nct\" (UniqueName: \"kubernetes.io/projected/a9797aca-9578-449d-ab30-190815736ab7-kube-api-access-x4nct\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.763260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2zf6\" (UniqueName: \"kubernetes.io/projected/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-kube-api-access-s2zf6\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.815818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" event={"ID":"7ad9fda2-065b-4620-bc36-e33403fcdd53","Type":"ContainerDied","Data":"0c73355c783b14554f90a7dfa86cc2c0bb326ec9c8c05b5283ab8fc662357270"} Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.815911 4962 scope.go:117] "RemoveContainer" containerID="47d64b0df8f4445f2fb91407f9f29c297bb37075610f907fbd8ab63be614b20d" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.816153 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p5kqx" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.822433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" event={"ID":"458c25bb-0a4d-4f96-9f46-cdfd66b6ae34","Type":"ContainerDied","Data":"619ebd5920a9e3102022f4b131e62761a9eda4a60ca4a89c2e607cb170e98e91"} Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.822584 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.841984 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5kqx"] Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.846855 4962 scope.go:117] "RemoveContainer" containerID="4bd3682a0231b1cc286c33cd99673e2d6760c5dc19a8ed1933dd9dd2000d6e74" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.852081 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p5kqx"] Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.859365 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"] Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.863384 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7hn2f"] Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.863827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-config\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.863873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-serving-cert\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.863902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-proxy-ca-bundles\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.863943 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-client-ca\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.863988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-client-ca\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.864023 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9797aca-9578-449d-ab30-190815736ab7-serving-cert\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " 
pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.864050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-config\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.864079 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4nct\" (UniqueName: \"kubernetes.io/projected/a9797aca-9578-449d-ab30-190815736ab7-kube-api-access-x4nct\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.864117 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2zf6\" (UniqueName: \"kubernetes.io/projected/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-kube-api-access-s2zf6\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.865364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-client-ca\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.865666 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-config\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.866438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-client-ca\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.866775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-proxy-ca-bundles\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.867560 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-config\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.873505 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-serving-cert\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.873631 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9797aca-9578-449d-ab30-190815736ab7-serving-cert\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.886226 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nct\" (UniqueName: \"kubernetes.io/projected/a9797aca-9578-449d-ab30-190815736ab7-kube-api-access-x4nct\") pod \"controller-manager-5f8bd458f5-vq6ns\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") " pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.887152 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2zf6\" (UniqueName: \"kubernetes.io/projected/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-kube-api-access-s2zf6\") pod \"route-controller-manager-757cc6bfc7-wdzwq\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") " pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.951076 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:06 crc kubenswrapper[4962]: I1201 21:39:06.960612 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.238853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"] Dec 01 21:39:07 crc kubenswrapper[4962]: W1201 21:39:07.248466 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9797aca_9578_449d_ab30_190815736ab7.slice/crio-24588e2f341b8769443e08c43f6ae6569bdb93f68d6efc5ab1c99ad0e5b9c073 WatchSource:0}: Error finding container 24588e2f341b8769443e08c43f6ae6569bdb93f68d6efc5ab1c99ad0e5b9c073: Status 404 returned error can't find the container with id 24588e2f341b8769443e08c43f6ae6569bdb93f68d6efc5ab1c99ad0e5b9c073 Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.410063 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"] Dec 01 21:39:07 crc kubenswrapper[4962]: W1201 21:39:07.414003 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b49a1bc_a2f2_48b8_98b9_96dfbb06381b.slice/crio-f113b1000910cc54939428c006c5f3a6e1ca520a6b92b6a70ce81b5b043b1d89 WatchSource:0}: Error finding container f113b1000910cc54939428c006c5f3a6e1ca520a6b92b6a70ce81b5b043b1d89: Status 404 returned error can't find the container with id f113b1000910cc54939428c006c5f3a6e1ca520a6b92b6a70ce81b5b043b1d89 Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.831971 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" event={"ID":"a9797aca-9578-449d-ab30-190815736ab7","Type":"ContainerStarted","Data":"c216df2ec7f0bd0f5ec66f3393980d373da9534e581aa57b74f3f4d4d0c57abf"} Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.832326 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" event={"ID":"a9797aca-9578-449d-ab30-190815736ab7","Type":"ContainerStarted","Data":"24588e2f341b8769443e08c43f6ae6569bdb93f68d6efc5ab1c99ad0e5b9c073"} Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.832760 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.833663 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" event={"ID":"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b","Type":"ContainerStarted","Data":"c9344987651b66cb93f80768e8a2b22ba4b0c94ea0f3eb340b67b3d7ee3b60ad"} Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.833726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" event={"ID":"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b","Type":"ContainerStarted","Data":"f113b1000910cc54939428c006c5f3a6e1ca520a6b92b6a70ce81b5b043b1d89"} Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.833886 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.839248 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.878670 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" podStartSLOduration=2.878655776 podStartE2EDuration="2.878655776s" podCreationTimestamp="2025-12-01 21:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:39:07.876273381 +0000 UTC m=+331.977712576" watchObservedRunningTime="2025-12-01 21:39:07.878655776 +0000 UTC m=+331.980094961" Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.878814 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" podStartSLOduration=2.87880845 podStartE2EDuration="2.87880845s" podCreationTimestamp="2025-12-01 21:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:39:07.857630007 +0000 UTC m=+331.959069302" watchObservedRunningTime="2025-12-01 21:39:07.87880845 +0000 UTC m=+331.980247645" Dec 01 21:39:07 crc kubenswrapper[4962]: I1201 21:39:07.988181 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" Dec 01 21:39:08 crc kubenswrapper[4962]: I1201 21:39:08.226211 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458c25bb-0a4d-4f96-9f46-cdfd66b6ae34" path="/var/lib/kubelet/pods/458c25bb-0a4d-4f96-9f46-cdfd66b6ae34/volumes" Dec 01 21:39:08 crc kubenswrapper[4962]: I1201 21:39:08.226857 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad9fda2-065b-4620-bc36-e33403fcdd53" path="/var/lib/kubelet/pods/7ad9fda2-065b-4620-bc36-e33403fcdd53/volumes" Dec 01 21:39:09 crc kubenswrapper[4962]: I1201 21:39:09.211196 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 21:39:15 crc kubenswrapper[4962]: I1201 21:39:15.409846 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:39:15 crc kubenswrapper[4962]: E1201 21:39:15.410079 4962 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 21:39:15 crc kubenswrapper[4962]: E1201 21:39:15.410884 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates podName:f02d5ee7-1f55-4474-94c6-005cdc9974bf nodeName:}" failed. No retries permitted until 2025-12-01 21:39:47.410856314 +0000 UTC m=+371.512295539 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-dfszb" (UID: "f02d5ee7-1f55-4474-94c6-005cdc9974bf") : secret "prometheus-operator-admission-webhook-tls" not found
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.299494 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"]
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.300714 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" podUID="a9797aca-9578-449d-ab30-190815736ab7" containerName="controller-manager" containerID="cri-o://c216df2ec7f0bd0f5ec66f3393980d373da9534e581aa57b74f3f4d4d0c57abf" gracePeriod=30
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.319044 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"]
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.319346 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" podUID="6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" containerName="route-controller-manager" containerID="cri-o://c9344987651b66cb93f80768e8a2b22ba4b0c94ea0f3eb340b67b3d7ee3b60ad" gracePeriod=30
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.937545 4962 generic.go:334] "Generic (PLEG): container finished" podID="6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" containerID="c9344987651b66cb93f80768e8a2b22ba4b0c94ea0f3eb340b67b3d7ee3b60ad" exitCode=0
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.937794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" event={"ID":"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b","Type":"ContainerDied","Data":"c9344987651b66cb93f80768e8a2b22ba4b0c94ea0f3eb340b67b3d7ee3b60ad"}
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.940383 4962 generic.go:334] "Generic (PLEG): container finished" podID="a9797aca-9578-449d-ab30-190815736ab7" containerID="c216df2ec7f0bd0f5ec66f3393980d373da9534e581aa57b74f3f4d4d0c57abf" exitCode=0
Dec 01 21:39:23 crc kubenswrapper[4962]: I1201 21:39:23.940415 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" event={"ID":"a9797aca-9578-449d-ab30-190815736ab7","Type":"ContainerDied","Data":"c216df2ec7f0bd0f5ec66f3393980d373da9534e581aa57b74f3f4d4d0c57abf"}
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.797715 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.856296 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"]
Dec 01 21:39:24 crc kubenswrapper[4962]: E1201 21:39:24.856867 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" containerName="route-controller-manager"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.856901 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" containerName="route-controller-manager"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.857253 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" containerName="route-controller-manager"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.858258 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.865789 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.869416 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"]
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.944636 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-serving-cert\") pod \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") "
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.944719 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-client-ca\") pod \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") "
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.944910 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-config\") pod \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") "
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.944995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-client-ca\") pod \"a9797aca-9578-449d-ab30-190815736ab7\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") "
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.945031 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-config\") pod \"a9797aca-9578-449d-ab30-190815736ab7\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") "
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.945073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2zf6\" (UniqueName: \"kubernetes.io/projected/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-kube-api-access-s2zf6\") pod \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\" (UID: \"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b\") "
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.945238 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf39f3c8-4876-439d-9450-3f9a0ba72480-serving-cert\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.945272 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-config\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.945294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-client-ca\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.945355 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x9bq\" (UniqueName: \"kubernetes.io/projected/cf39f3c8-4876-439d-9450-3f9a0ba72480-kube-api-access-9x9bq\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.946436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-config" (OuterVolumeSpecName: "config") pod "a9797aca-9578-449d-ab30-190815736ab7" (UID: "a9797aca-9578-449d-ab30-190815736ab7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.947203 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9797aca-9578-449d-ab30-190815736ab7" (UID: "a9797aca-9578-449d-ab30-190815736ab7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.947617 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-client-ca" (OuterVolumeSpecName: "client-ca") pod "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" (UID: "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.947709 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-config" (OuterVolumeSpecName: "config") pod "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" (UID: "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.948656 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns" event={"ID":"a9797aca-9578-449d-ab30-190815736ab7","Type":"ContainerDied","Data":"24588e2f341b8769443e08c43f6ae6569bdb93f68d6efc5ab1c99ad0e5b9c073"}
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.948712 4962 scope.go:117] "RemoveContainer" containerID="c216df2ec7f0bd0f5ec66f3393980d373da9534e581aa57b74f3f4d4d0c57abf"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.948718 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.951754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq" event={"ID":"6b49a1bc-a2f2-48b8-98b9-96dfbb06381b","Type":"ContainerDied","Data":"f113b1000910cc54939428c006c5f3a6e1ca520a6b92b6a70ce81b5b043b1d89"}
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.951813 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-kube-api-access-s2zf6" (OuterVolumeSpecName: "kube-api-access-s2zf6") pod "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" (UID: "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b"). InnerVolumeSpecName "kube-api-access-s2zf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.951868 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.962140 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" (UID: "6b49a1bc-a2f2-48b8-98b9-96dfbb06381b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:39:24 crc kubenswrapper[4962]: I1201 21:39:24.974470 4962 scope.go:117] "RemoveContainer" containerID="c9344987651b66cb93f80768e8a2b22ba4b0c94ea0f3eb340b67b3d7ee3b60ad"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.046045 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4nct\" (UniqueName: \"kubernetes.io/projected/a9797aca-9578-449d-ab30-190815736ab7-kube-api-access-x4nct\") pod \"a9797aca-9578-449d-ab30-190815736ab7\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") "
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.046163 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-proxy-ca-bundles\") pod \"a9797aca-9578-449d-ab30-190815736ab7\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") "
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.046221 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9797aca-9578-449d-ab30-190815736ab7-serving-cert\") pod \"a9797aca-9578-449d-ab30-190815736ab7\" (UID: \"a9797aca-9578-449d-ab30-190815736ab7\") "
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.046959 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9797aca-9578-449d-ab30-190815736ab7" (UID: "a9797aca-9578-449d-ab30-190815736ab7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.047135 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf39f3c8-4876-439d-9450-3f9a0ba72480-serving-cert\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.047160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-config\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.048457 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-config\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.049066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-client-ca\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.049702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-client-ca\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.049822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x9bq\" (UniqueName: \"kubernetes.io/projected/cf39f3c8-4876-439d-9450-3f9a0ba72480-kube-api-access-9x9bq\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.049919 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.050280 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.050292 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.050302 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.050310 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.050318 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2zf6\" (UniqueName: \"kubernetes.io/projected/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b-kube-api-access-s2zf6\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.050346 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9797aca-9578-449d-ab30-190815736ab7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.051052 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9797aca-9578-449d-ab30-190815736ab7-kube-api-access-x4nct" (OuterVolumeSpecName: "kube-api-access-x4nct") pod "a9797aca-9578-449d-ab30-190815736ab7" (UID: "a9797aca-9578-449d-ab30-190815736ab7"). InnerVolumeSpecName "kube-api-access-x4nct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.052117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9797aca-9578-449d-ab30-190815736ab7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9797aca-9578-449d-ab30-190815736ab7" (UID: "a9797aca-9578-449d-ab30-190815736ab7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.059148 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf39f3c8-4876-439d-9450-3f9a0ba72480-serving-cert\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.065358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x9bq\" (UniqueName: \"kubernetes.io/projected/cf39f3c8-4876-439d-9450-3f9a0ba72480-kube-api-access-9x9bq\") pod \"route-controller-manager-86d966b9bd-gwr42\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") " pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.151146 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4nct\" (UniqueName: \"kubernetes.io/projected/a9797aca-9578-449d-ab30-190815736ab7-kube-api-access-x4nct\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.151175 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9797aca-9578-449d-ab30-190815736ab7-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.183887 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.295736 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"]
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.303371 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f8bd458f5-vq6ns"]
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.319118 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"]
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.323555 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-757cc6bfc7-wdzwq"]
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.461957 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"]
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.956787 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42" event={"ID":"cf39f3c8-4876-439d-9450-3f9a0ba72480","Type":"ContainerStarted","Data":"8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7"}
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.957175 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.957202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42" event={"ID":"cf39f3c8-4876-439d-9450-3f9a0ba72480","Type":"ContainerStarted","Data":"5bf5c87b2da16da09b9c9869aa97c942e28efe9d16e058d3bb7f26b413b0289b"}
Dec 01 21:39:25 crc kubenswrapper[4962]: I1201 21:39:25.979502 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42" podStartSLOduration=2.979482936 podStartE2EDuration="2.979482936s" podCreationTimestamp="2025-12-01 21:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:39:25.976772193 +0000 UTC m=+350.078211398" watchObservedRunningTime="2025-12-01 21:39:25.979482936 +0000 UTC m=+350.080922131"
Dec 01 21:39:26 crc kubenswrapper[4962]: I1201 21:39:26.227579 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b49a1bc-a2f2-48b8-98b9-96dfbb06381b" path="/var/lib/kubelet/pods/6b49a1bc-a2f2-48b8-98b9-96dfbb06381b/volumes"
Dec 01 21:39:26 crc kubenswrapper[4962]: I1201 21:39:26.228289 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9797aca-9578-449d-ab30-190815736ab7" path="/var/lib/kubelet/pods/a9797aca-9578-449d-ab30-190815736ab7/volumes"
Dec 01 21:39:26 crc kubenswrapper[4962]: I1201 21:39:26.379336 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.631509 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-cm8vf"]
Dec 01 21:39:27 crc kubenswrapper[4962]: E1201 21:39:27.632250 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9797aca-9578-449d-ab30-190815736ab7" containerName="controller-manager"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.632273 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9797aca-9578-449d-ab30-190815736ab7" containerName="controller-manager"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.632451 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9797aca-9578-449d-ab30-190815736ab7" containerName="controller-manager"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.633228 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.637490 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.637810 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.639299 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.639463 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.640556 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.644819 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.652883 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.671146 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-cm8vf"]
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.689163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-proxy-ca-bundles\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.689468 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-serving-cert\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.689593 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9bfl\" (UniqueName: \"kubernetes.io/projected/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-kube-api-access-r9bfl\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.689767 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-client-ca\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.689861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-config\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.791500 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-client-ca\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.791617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-config\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.791676 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-proxy-ca-bundles\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.791772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-serving-cert\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.791804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9bfl\" (UniqueName: \"kubernetes.io/projected/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-kube-api-access-r9bfl\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.794001 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-client-ca\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.796441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-config\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.800334 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-proxy-ca-bundles\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.813501 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-serving-cert\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.817910 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9bfl\" (UniqueName: \"kubernetes.io/projected/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-kube-api-access-r9bfl\") pod \"controller-manager-584b64974b-cm8vf\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") " pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:27 crc kubenswrapper[4962]: I1201 21:39:27.970450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:28 crc kubenswrapper[4962]: I1201 21:39:28.189020 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-cm8vf"]
Dec 01 21:39:28 crc kubenswrapper[4962]: I1201 21:39:28.979883 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf" event={"ID":"c314eced-b941-4cb3-9ffd-a33bee1fbfe4","Type":"ContainerStarted","Data":"5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257"}
Dec 01 21:39:28 crc kubenswrapper[4962]: I1201 21:39:28.980207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf" event={"ID":"c314eced-b941-4cb3-9ffd-a33bee1fbfe4","Type":"ContainerStarted","Data":"be60171060eb8d93bc3080dae1d6af72f81c56aefd9213985c51da8cf60d63ec"}
Dec 01 21:39:28 crc kubenswrapper[4962]: I1201 21:39:28.981044 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:28 crc kubenswrapper[4962]: I1201 21:39:28.985728 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:29 crc kubenswrapper[4962]: I1201 21:39:29.008400 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf" podStartSLOduration=6.008384037 podStartE2EDuration="6.008384037s" podCreationTimestamp="2025-12-01 21:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:39:29.006102315 +0000 UTC m=+353.107541530" watchObservedRunningTime="2025-12-01 21:39:29.008384037 +0000 UTC m=+353.109823232"
Dec 01 21:39:32 crc kubenswrapper[4962]: I1201 21:39:32.784249 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 21:39:32 crc kubenswrapper[4962]: I1201 21:39:32.784344 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 21:39:45 crc kubenswrapper[4962]: I1201 21:39:45.444398 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-cm8vf"]
Dec 01 21:39:45 crc kubenswrapper[4962]: I1201 21:39:45.445103 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf" podUID="c314eced-b941-4cb3-9ffd-a33bee1fbfe4" containerName="controller-manager" containerID="cri-o://5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257" gracePeriod=30
Dec 01 21:39:45 crc kubenswrapper[4962]: I1201 21:39:45.452243 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"]
Dec 01 21:39:45 crc kubenswrapper[4962]: I1201 21:39:45.452775 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42" podUID="cf39f3c8-4876-439d-9450-3f9a0ba72480" containerName="route-controller-manager" containerID="cri-o://8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7" gracePeriod=30
Dec 01 21:39:45 crc kubenswrapper[4962]: I1201 21:39:45.942982 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:45 crc kubenswrapper[4962]: I1201 21:39:45.946739 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029226 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf39f3c8-4876-439d-9450-3f9a0ba72480-serving-cert\") pod \"cf39f3c8-4876-439d-9450-3f9a0ba72480\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029338 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-config\") pod \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-client-ca\") pod \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029401 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-proxy-ca-bundles\") pod \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029426 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x9bq\" (UniqueName: \"kubernetes.io/projected/cf39f3c8-4876-439d-9450-3f9a0ba72480-kube-api-access-9x9bq\") pod \"cf39f3c8-4876-439d-9450-3f9a0ba72480\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029444 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-client-ca\") pod \"cf39f3c8-4876-439d-9450-3f9a0ba72480\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029475 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-serving-cert\") pod \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029492 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9bfl\" (UniqueName: \"kubernetes.io/projected/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-kube-api-access-r9bfl\") pod \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\" (UID: \"c314eced-b941-4cb3-9ffd-a33bee1fbfe4\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.029518 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-config\") pod \"cf39f3c8-4876-439d-9450-3f9a0ba72480\" (UID: \"cf39f3c8-4876-439d-9450-3f9a0ba72480\") "
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.030106 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf39f3c8-4876-439d-9450-3f9a0ba72480" (UID: "cf39f3c8-4876-439d-9450-3f9a0ba72480"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.030244 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c314eced-b941-4cb3-9ffd-a33bee1fbfe4" (UID: "c314eced-b941-4cb3-9ffd-a33bee1fbfe4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.030351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-client-ca" (OuterVolumeSpecName: "client-ca") pod "c314eced-b941-4cb3-9ffd-a33bee1fbfe4" (UID: "c314eced-b941-4cb3-9ffd-a33bee1fbfe4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.030338 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-config" (OuterVolumeSpecName: "config") pod "c314eced-b941-4cb3-9ffd-a33bee1fbfe4" (UID: "c314eced-b941-4cb3-9ffd-a33bee1fbfe4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.030453 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-config" (OuterVolumeSpecName: "config") pod "cf39f3c8-4876-439d-9450-3f9a0ba72480" (UID: "cf39f3c8-4876-439d-9450-3f9a0ba72480"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.036198 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf39f3c8-4876-439d-9450-3f9a0ba72480-kube-api-access-9x9bq" (OuterVolumeSpecName: "kube-api-access-9x9bq") pod "cf39f3c8-4876-439d-9450-3f9a0ba72480" (UID: "cf39f3c8-4876-439d-9450-3f9a0ba72480"). InnerVolumeSpecName "kube-api-access-9x9bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.036222 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf39f3c8-4876-439d-9450-3f9a0ba72480-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf39f3c8-4876-439d-9450-3f9a0ba72480" (UID: "cf39f3c8-4876-439d-9450-3f9a0ba72480"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.036255 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-kube-api-access-r9bfl" (OuterVolumeSpecName: "kube-api-access-r9bfl") pod "c314eced-b941-4cb3-9ffd-a33bee1fbfe4" (UID: "c314eced-b941-4cb3-9ffd-a33bee1fbfe4"). InnerVolumeSpecName "kube-api-access-r9bfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.036429 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c314eced-b941-4cb3-9ffd-a33bee1fbfe4" (UID: "c314eced-b941-4cb3-9ffd-a33bee1fbfe4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.091005 4962 generic.go:334] "Generic (PLEG): container finished" podID="cf39f3c8-4876-439d-9450-3f9a0ba72480" containerID="8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7" exitCode=0
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.091094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42" event={"ID":"cf39f3c8-4876-439d-9450-3f9a0ba72480","Type":"ContainerDied","Data":"8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7"}
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.091182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42" event={"ID":"cf39f3c8-4876-439d-9450-3f9a0ba72480","Type":"ContainerDied","Data":"5bf5c87b2da16da09b9c9869aa97c942e28efe9d16e058d3bb7f26b413b0289b"}
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.091205 4962 scope.go:117] "RemoveContainer" containerID="8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.091107 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.097008 4962 generic.go:334] "Generic (PLEG): container finished" podID="c314eced-b941-4cb3-9ffd-a33bee1fbfe4" containerID="5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257" exitCode=0
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.097055 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf" event={"ID":"c314eced-b941-4cb3-9ffd-a33bee1fbfe4","Type":"ContainerDied","Data":"5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257"}
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.097075 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.097082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-584b64974b-cm8vf" event={"ID":"c314eced-b941-4cb3-9ffd-a33bee1fbfe4","Type":"ContainerDied","Data":"be60171060eb8d93bc3080dae1d6af72f81c56aefd9213985c51da8cf60d63ec"}
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.117812 4962 scope.go:117] "RemoveContainer" containerID="8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7"
Dec 01 21:39:46 crc kubenswrapper[4962]: E1201 21:39:46.118436 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7\": container with ID starting with 8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7 not found: ID does not exist" containerID="8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.118475 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7"} err="failed to get container status \"8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7\": rpc error: code = NotFound desc = could not find container \"8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7\": container with ID starting with 8fa75406acf0000047503c5e428c6e8e1662eacb0667dfdfde67b94c9ffe31c7 not found: ID does not exist"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.118499 4962 scope.go:117] "RemoveContainer" containerID="5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.127413 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"]
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130080 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d966b9bd-gwr42"]
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130713 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130738 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130747 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130756 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x9bq\" (UniqueName: \"kubernetes.io/projected/cf39f3c8-4876-439d-9450-3f9a0ba72480-kube-api-access-9x9bq\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130765 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130773 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130783 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9bfl\" (UniqueName: \"kubernetes.io/projected/c314eced-b941-4cb3-9ffd-a33bee1fbfe4-kube-api-access-r9bfl\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130791 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf39f3c8-4876-439d-9450-3f9a0ba72480-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.130799 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf39f3c8-4876-439d-9450-3f9a0ba72480-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.132040 4962 scope.go:117] "RemoveContainer" containerID="5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257"
Dec 01 21:39:46 crc kubenswrapper[4962]: E1201 21:39:46.132418 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257\": container with ID starting with 5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257 not found: ID does not exist" containerID="5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.132465 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257"} err="failed to get container status \"5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257\": rpc error: code = NotFound desc = could not find container \"5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257\": container with ID starting with 5e032bd0b70c20d2e7244427057284cc05215656d67be028f4bdf1ed111d7257 not found: ID does not exist"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.140399 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-cm8vf"]
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.145207 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-584b64974b-cm8vf"]
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.225618 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c314eced-b941-4cb3-9ffd-a33bee1fbfe4" path="/var/lib/kubelet/pods/c314eced-b941-4cb3-9ffd-a33bee1fbfe4/volumes"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.226161 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf39f3c8-4876-439d-9450-3f9a0ba72480" path="/var/lib/kubelet/pods/cf39f3c8-4876-439d-9450-3f9a0ba72480/volumes"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.797070 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78f444d6c-nlx86"]
Dec 01 21:39:46 crc kubenswrapper[4962]: E1201 21:39:46.797483 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf39f3c8-4876-439d-9450-3f9a0ba72480" containerName="route-controller-manager"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.797520 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf39f3c8-4876-439d-9450-3f9a0ba72480" containerName="route-controller-manager"
Dec 01 21:39:46 crc kubenswrapper[4962]: E1201 21:39:46.797554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c314eced-b941-4cb3-9ffd-a33bee1fbfe4" containerName="controller-manager"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.797563 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c314eced-b941-4cb3-9ffd-a33bee1fbfe4" containerName="controller-manager"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.797724 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c314eced-b941-4cb3-9ffd-a33bee1fbfe4" containerName="controller-manager"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.797762 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf39f3c8-4876-439d-9450-3f9a0ba72480" containerName="route-controller-manager"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.798505 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.800985 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"]
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.801462 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.801582 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.801829 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.803873 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.804270 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.804739 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.804793 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.806993 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.807090 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.807149 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.807188 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.807724 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.811503 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.820610 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78f444d6c-nlx86"]
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.822476 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.836204 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"]
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840247 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-proxy-ca-bundles\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bb96f83-1535-43e7-8126-4caaf42b53ab-client-ca\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768xw\" (UniqueName: \"kubernetes.io/projected/8bb96f83-1535-43e7-8126-4caaf42b53ab-kube-api-access-768xw\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840356 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdpm9\" (UniqueName: \"kubernetes.io/projected/3f9fd187-e952-4b88-862c-1e1642af1e35-kube-api-access-hdpm9\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb96f83-1535-43e7-8126-4caaf42b53ab-config\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-config\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840460 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9fd187-e952-4b88-862c-1e1642af1e35-serving-cert\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb96f83-1535-43e7-8126-4caaf42b53ab-serving-cert\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.840521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-client-ca\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb96f83-1535-43e7-8126-4caaf42b53ab-config\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942223 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-config\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9fd187-e952-4b88-862c-1e1642af1e35-serving-cert\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942379 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb96f83-1535-43e7-8126-4caaf42b53ab-serving-cert\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-client-ca\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-proxy-ca-bundles\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bb96f83-1535-43e7-8126-4caaf42b53ab-client-ca\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942593 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-768xw\" (UniqueName: \"kubernetes.io/projected/8bb96f83-1535-43e7-8126-4caaf42b53ab-kube-api-access-768xw\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.942638 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdpm9\" (UniqueName: \"kubernetes.io/projected/3f9fd187-e952-4b88-862c-1e1642af1e35-kube-api-access-hdpm9\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.943633 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb96f83-1535-43e7-8126-4caaf42b53ab-config\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.943844 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bb96f83-1535-43e7-8126-4caaf42b53ab-client-ca\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.945369 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-config\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.945850 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-proxy-ca-bundles\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.946050 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9fd187-e952-4b88-862c-1e1642af1e35-client-ca\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.948202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb96f83-1535-43e7-8126-4caaf42b53ab-serving-cert\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.949412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9fd187-e952-4b88-862c-1e1642af1e35-serving-cert\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.965807 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-768xw\" (UniqueName: \"kubernetes.io/projected/8bb96f83-1535-43e7-8126-4caaf42b53ab-kube-api-access-768xw\") pod \"route-controller-manager-8479ffcfd5-h7qp4\" (UID: \"8bb96f83-1535-43e7-8126-4caaf42b53ab\") " pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:46 crc kubenswrapper[4962]: I1201 21:39:46.969843 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdpm9\" (UniqueName: \"kubernetes.io/projected/3f9fd187-e952-4b88-862c-1e1642af1e35-kube-api-access-hdpm9\") pod \"controller-manager-78f444d6c-nlx86\" (UID: \"3f9fd187-e952-4b88-862c-1e1642af1e35\") " pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:47 crc kubenswrapper[4962]: I1201 21:39:47.167151 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86"
Dec 01 21:39:47 crc kubenswrapper[4962]: I1201 21:39:47.181725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"
Dec 01 21:39:47 crc kubenswrapper[4962]: I1201 21:39:47.409466 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78f444d6c-nlx86"]
Dec 01 21:39:47 crc kubenswrapper[4962]: W1201 21:39:47.413281 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9fd187_e952_4b88_862c_1e1642af1e35.slice/crio-12951f8974b3a3cbcd8f558b92b544f8c44fde568044b2f90a246c03e582dc21 WatchSource:0}: Error finding container 12951f8974b3a3cbcd8f558b92b544f8c44fde568044b2f90a246c03e582dc21: Status 404 returned error can't find the container with id 12951f8974b3a3cbcd8f558b92b544f8c44fde568044b2f90a246c03e582dc21
Dec 01 21:39:47 crc kubenswrapper[4962]: I1201 21:39:47.450966 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb"
Dec 01 21:39:47 crc kubenswrapper[4962]: I1201 21:39:47.456867 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f02d5ee7-1f55-4474-94c6-005cdc9974bf-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dfszb\" (UID: \"f02d5ee7-1f55-4474-94c6-005cdc9974bf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb"
Dec 01 21:39:47 crc kubenswrapper[4962]: I1201 21:39:47.675204 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4"]
Dec 01 21:39:47 crc kubenswrapper[4962]: W1201 21:39:47.682144 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bb96f83_1535_43e7_8126_4caaf42b53ab.slice/crio-ee2720c3d8126f893e224da77ca3304f3f19f8936b56d2e2e8e723eb58b7aa3a WatchSource:0}: Error finding container ee2720c3d8126f893e224da77ca3304f3f19f8936b56d2e2e8e723eb58b7aa3a: Status 404 returned error can't find the container with id ee2720c3d8126f893e224da77ca3304f3f19f8936b56d2e2e8e723eb58b7aa3a
Dec 01 21:39:47 crc kubenswrapper[4962]: I1201 21:39:47.694191 4962 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.111531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86" event={"ID":"3f9fd187-e952-4b88-862c-1e1642af1e35","Type":"ContainerStarted","Data":"a03b4b28cee9f8c369a787c0e9e76d9972c822f6e52138c12856aa1e627cdd2f"} Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.111564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86" event={"ID":"3f9fd187-e952-4b88-862c-1e1642af1e35","Type":"ContainerStarted","Data":"12951f8974b3a3cbcd8f558b92b544f8c44fde568044b2f90a246c03e582dc21"} Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.111751 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86" Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.112818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4" event={"ID":"8bb96f83-1535-43e7-8126-4caaf42b53ab","Type":"ContainerStarted","Data":"e6de2ce140f0f20059cece9dd4fc3347fa5a532c99e353dcc38eb9f9a1b99f93"} Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.112839 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4" event={"ID":"8bb96f83-1535-43e7-8126-4caaf42b53ab","Type":"ContainerStarted","Data":"ee2720c3d8126f893e224da77ca3304f3f19f8936b56d2e2e8e723eb58b7aa3a"} Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.113287 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4" Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.117363 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86" Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.129654 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78f444d6c-nlx86" podStartSLOduration=3.12964181 podStartE2EDuration="3.12964181s" podCreationTimestamp="2025-12-01 21:39:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:39:48.127662217 +0000 UTC m=+372.229101442" watchObservedRunningTime="2025-12-01 21:39:48.12964181 +0000 UTC m=+372.231081005" Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.148163 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4" podStartSLOduration=3.148145111 podStartE2EDuration="3.148145111s" podCreationTimestamp="2025-12-01 21:39:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:39:48.144576834 +0000 UTC m=+372.246016039" watchObservedRunningTime="2025-12-01 21:39:48.148145111 +0000 UTC m=+372.249584306" Dec 01 21:39:48 crc kubenswrapper[4962]: I1201 21:39:48.186663 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb"] Dec 01 21:39:48 crc kubenswrapper[4962]: 
I1201 21:39:48.359003 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8479ffcfd5-h7qp4" Dec 01 21:39:49 crc kubenswrapper[4962]: I1201 21:39:49.117747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" event={"ID":"f02d5ee7-1f55-4474-94c6-005cdc9974bf","Type":"ContainerStarted","Data":"38715fc049e665e56b4e8fdd1c9754208d4593cd943b1e95f81bce77f1ed98dd"} Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.148601 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" event={"ID":"f02d5ee7-1f55-4474-94c6-005cdc9974bf","Type":"ContainerStarted","Data":"30a87f90b2d91c758070e04f5563712c1a6f6516258ea2d55c7cbf7c406b99ce"} Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.150203 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.158639 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.166838 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dfszb" podStartSLOduration=66.322754935 podStartE2EDuration="1m8.166805054s" podCreationTimestamp="2025-12-01 21:38:43 +0000 UTC" firstStartedPulling="2025-12-01 21:39:48.19395313 +0000 UTC m=+372.295392365" lastFinishedPulling="2025-12-01 21:39:50.038003289 +0000 UTC m=+374.139442484" observedRunningTime="2025-12-01 21:39:51.16369083 +0000 UTC m=+375.265130105" watchObservedRunningTime="2025-12-01 21:39:51.166805054 +0000 UTC m=+375.268244269" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.531965 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-9xxm9"] Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.533333 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.536579 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.536739 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.536618 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.537538 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-g4scc" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.539537 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-9xxm9"] Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.715310 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w6sx\" (UniqueName: \"kubernetes.io/projected/09f86e77-5df3-4ef5-b8e4-2f09adc88598-kube-api-access-8w6sx\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.715374 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09f86e77-5df3-4ef5-b8e4-2f09adc88598-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.715409 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09f86e77-5df3-4ef5-b8e4-2f09adc88598-metrics-client-ca\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.715460 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/09f86e77-5df3-4ef5-b8e4-2f09adc88598-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.817192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w6sx\" (UniqueName: \"kubernetes.io/projected/09f86e77-5df3-4ef5-b8e4-2f09adc88598-kube-api-access-8w6sx\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.817267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/09f86e77-5df3-4ef5-b8e4-2f09adc88598-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.817294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09f86e77-5df3-4ef5-b8e4-2f09adc88598-metrics-client-ca\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.817321 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/09f86e77-5df3-4ef5-b8e4-2f09adc88598-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.819378 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09f86e77-5df3-4ef5-b8e4-2f09adc88598-metrics-client-ca\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.823179 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09f86e77-5df3-4ef5-b8e4-2f09adc88598-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.824049 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/09f86e77-5df3-4ef5-b8e4-2f09adc88598-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.854144 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w6sx\" (UniqueName: \"kubernetes.io/projected/09f86e77-5df3-4ef5-b8e4-2f09adc88598-kube-api-access-8w6sx\") pod \"prometheus-operator-db54df47d-9xxm9\" (UID: \"09f86e77-5df3-4ef5-b8e4-2f09adc88598\") " pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:51 crc kubenswrapper[4962]: I1201 21:39:51.854434 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" Dec 01 21:39:52 crc kubenswrapper[4962]: I1201 21:39:52.295036 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-9xxm9"] Dec 01 21:39:52 crc kubenswrapper[4962]: W1201 21:39:52.310578 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f86e77_5df3_4ef5_b8e4_2f09adc88598.slice/crio-414707065da8e395b8412269ba8f7e92b3cb8271fa5f4df184b08b132f68ce83 WatchSource:0}: Error finding container 414707065da8e395b8412269ba8f7e92b3cb8271fa5f4df184b08b132f68ce83: Status 404 returned error can't find the container with id 414707065da8e395b8412269ba8f7e92b3cb8271fa5f4df184b08b132f68ce83 Dec 01 21:39:53 crc kubenswrapper[4962]: I1201 21:39:53.160888 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" event={"ID":"09f86e77-5df3-4ef5-b8e4-2f09adc88598","Type":"ContainerStarted","Data":"414707065da8e395b8412269ba8f7e92b3cb8271fa5f4df184b08b132f68ce83"} Dec 01 21:39:54 crc kubenswrapper[4962]: I1201 21:39:54.168635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" event={"ID":"09f86e77-5df3-4ef5-b8e4-2f09adc88598","Type":"ContainerStarted","Data":"8874586c3d55cfdc278296e99ff0e92648d6678fb0a31e9e3ecef99e2512da29"} Dec 01 21:39:54 crc kubenswrapper[4962]: I1201 21:39:54.169087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" event={"ID":"09f86e77-5df3-4ef5-b8e4-2f09adc88598","Type":"ContainerStarted","Data":"fe4e2d88630c5c75a90de285c2ee8916a1b3ecb225a1c48063c769c0e72c4f15"} Dec 01 21:39:54 crc kubenswrapper[4962]: I1201 21:39:54.197424 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-9xxm9" podStartSLOduration=1.6560879929999999 podStartE2EDuration="3.197402331s" podCreationTimestamp="2025-12-01 21:39:51 +0000 UTC" firstStartedPulling="2025-12-01 21:39:52.312858405 +0000 UTC m=+376.414297640" lastFinishedPulling="2025-12-01 21:39:53.854172773 +0000 UTC m=+377.955611978" observedRunningTime="2025-12-01 21:39:54.192848508 +0000 UTC m=+378.294287763" watchObservedRunningTime="2025-12-01 21:39:54.197402331 +0000 UTC m=+378.298841556" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.890890 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz"] Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.892630 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.894521 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.895731 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-rbthz" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.895772 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.911655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz"] Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.925229 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2fwdb"] Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.926427 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.928129 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-7pzfv" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.930138 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf"] Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.931392 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.936019 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.936316 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-t9jv2" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.937177 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.937404 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.937647 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.942416 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.954241 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf"] Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-tls\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 
21:39:55.975223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p77dp\" (UniqueName: \"kubernetes.io/projected/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-kube-api-access-p77dp\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfc2q\" (UniqueName: \"kubernetes.io/projected/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-kube-api-access-gfc2q\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975270 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975299 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-metrics-client-ca\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975316 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975335 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8dgj\" (UniqueName: \"kubernetes.io/projected/f9954d35-3a54-4528-975f-e834bd649bc6-kube-api-access-l8dgj\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975350 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975368 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" 
Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975392 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-wtmp\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975412 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f9954d35-3a54-4528-975f-e834bd649bc6-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975447 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9954d35-3a54-4528-975f-e834bd649bc6-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-textfile\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975483 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:55 crc kubenswrapper[4962]: I1201 21:39:55.975521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-sys\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:55 crc 
kubenswrapper[4962]: I1201 21:39:55.975536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-root\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.077112 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-metrics-client-ca\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.078012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.078214 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8dgj\" (UniqueName: \"kubernetes.io/projected/f9954d35-3a54-4528-975f-e834bd649bc6-kube-api-access-l8dgj\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.078544 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.079515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.079670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-wtmp\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.079786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.080552 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f9954d35-3a54-4528-975f-e834bd649bc6-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.080914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9954d35-3a54-4528-975f-e834bd649bc6-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.081018 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-textfile\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.081091 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.081161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082052 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-sys\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082093 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-root\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082135 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-tls\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfc2q\" (UniqueName: \"kubernetes.io/projected/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-kube-api-access-gfc2q\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082219 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p77dp\" (UniqueName: \"kubernetes.io/projected/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-kube-api-access-p77dp\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082275 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082515 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-root\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.080871 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f9954d35-3a54-4528-975f-e834bd649bc6-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-sys\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.082980 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-textfile\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.077974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-metrics-client-ca\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.083766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9954d35-3a54-4528-975f-e834bd649bc6-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.079893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-wtmp\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc 
kubenswrapper[4962]: I1201 21:39:56.084560 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.080482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: E1201 21:39:56.078175 4962 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Dec 01 21:39:56 crc kubenswrapper[4962]: E1201 21:39:56.084850 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-tls podName:f9954d35-3a54-4528-975f-e834bd649bc6 nodeName:}" failed. No retries permitted until 2025-12-01 21:39:56.584819593 +0000 UTC m=+380.686258788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-cgcbf" (UID: "f9954d35-3a54-4528-975f-e834bd649bc6") : secret "kube-state-metrics-tls" not found Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.085330 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.085358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.085883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.089315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 
21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.100427 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-node-exporter-tls\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.102057 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8dgj\" (UniqueName: \"kubernetes.io/projected/f9954d35-3a54-4528-975f-e834bd649bc6-kube-api-access-l8dgj\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.107479 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfc2q\" (UniqueName: \"kubernetes.io/projected/1c1f9eb9-b365-483d-85cb-a877ecaee3b3-kube-api-access-gfc2q\") pod \"node-exporter-2fwdb\" (UID: \"1c1f9eb9-b365-483d-85cb-a877ecaee3b3\") " pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.107529 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p77dp\" (UniqueName: \"kubernetes.io/projected/3b3107c8-baf4-4aee-a3b9-70df9e5022eb-kube-api-access-p77dp\") pod \"openshift-state-metrics-566fddb674-t8nvz\" (UID: \"3b3107c8-baf4-4aee-a3b9-70df9e5022eb\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.206375 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.243631 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2fwdb" Dec 01 21:39:56 crc kubenswrapper[4962]: W1201 21:39:56.273282 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c1f9eb9_b365_483d_85cb_a877ecaee3b3.slice/crio-df3ce76fbba3c5a0be3ad16d687d5917a015a8d70f609a682e570c46a1587de1 WatchSource:0}: Error finding container df3ce76fbba3c5a0be3ad16d687d5917a015a8d70f609a682e570c46a1587de1: Status 404 returned error can't find the container with id df3ce76fbba3c5a0be3ad16d687d5917a015a8d70f609a682e570c46a1587de1 Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.592134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.601126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9954d35-3a54-4528-975f-e834bd649bc6-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-cgcbf\" (UID: \"f9954d35-3a54-4528-975f-e834bd649bc6\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.700699 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz"] Dec 01 21:39:56 crc kubenswrapper[4962]: W1201 21:39:56.703272 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b3107c8_baf4_4aee_a3b9_70df9e5022eb.slice/crio-18093068c6c0036e0dd4778284ebb37f3cd775bc38b9f2e1b1e40e87135f1c88 WatchSource:0}: Error finding container 18093068c6c0036e0dd4778284ebb37f3cd775bc38b9f2e1b1e40e87135f1c88: Status 404 returned error can't find the container with id 18093068c6c0036e0dd4778284ebb37f3cd775bc38b9f2e1b1e40e87135f1c88 Dec 01 21:39:56 crc kubenswrapper[4962]: I1201 21:39:56.853972 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.028044 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.030284 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.031781 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.046765 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.047116 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.047140 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.047270 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.047408 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.047444 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.053187 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-6dnt2" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.062734 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.095182 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.102817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b4dd7c-b41f-442b-9fc7-9e248312c359-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.102877 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b4dd7c-b41f-442b-9fc7-9e248312c359-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.102922 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.102984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-web-config\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103014 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103051 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-config-volume\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103084 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103128 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggr2\" (UniqueName: \"kubernetes.io/projected/c1b4dd7c-b41f-442b-9fc7-9e248312c359-kube-api-access-lggr2\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103158 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103185 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b4dd7c-b41f-442b-9fc7-9e248312c359-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b4dd7c-b41f-442b-9fc7-9e248312c359-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.103264 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1b4dd7c-b41f-442b-9fc7-9e248312c359-config-out\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.204811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b4dd7c-b41f-442b-9fc7-9e248312c359-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.204859 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1b4dd7c-b41f-442b-9fc7-9e248312c359-config-out\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.204893 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b4dd7c-b41f-442b-9fc7-9e248312c359-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b4dd7c-b41f-442b-9fc7-9e248312c359-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205191 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-web-config\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205754 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-config-volume\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205775 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205796 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-lggr2\" (UniqueName: \"kubernetes.io/projected/c1b4dd7c-b41f-442b-9fc7-9e248312c359-kube-api-access-lggr2\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205814 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b4dd7c-b41f-442b-9fc7-9e248312c359-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205829 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205972 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b4dd7c-b41f-442b-9fc7-9e248312c359-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.205177 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" event={"ID":"3b3107c8-baf4-4aee-a3b9-70df9e5022eb","Type":"ContainerStarted","Data":"18093068c6c0036e0dd4778284ebb37f3cd775bc38b9f2e1b1e40e87135f1c88"} Dec 01 21:39:57 crc kubenswrapper[4962]: E1201 21:39:57.206621 4962 secret.go:188] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 01 21:39:57 crc kubenswrapper[4962]: E1201 21:39:57.206677 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls podName:c1b4dd7c-b41f-442b-9fc7-9e248312c359 nodeName:}" failed. No retries permitted until 2025-12-01 21:39:57.70666012 +0000 UTC m=+381.808099315 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "c1b4dd7c-b41f-442b-9fc7-9e248312c359") : secret "alertmanager-main-tls" not found Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.206761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b4dd7c-b41f-442b-9fc7-9e248312c359-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.207856 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b4dd7c-b41f-442b-9fc7-9e248312c359-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.211820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b4dd7c-b41f-442b-9fc7-9e248312c359-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.212195 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.212653 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fwdb" event={"ID":"1c1f9eb9-b365-483d-85cb-a877ecaee3b3","Type":"ContainerStarted","Data":"df3ce76fbba3c5a0be3ad16d687d5917a015a8d70f609a682e570c46a1587de1"} Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.212830 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-config-volume\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.216434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.231510 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-web-config\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.232880 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/c1b4dd7c-b41f-442b-9fc7-9e248312c359-config-out\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.248325 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggr2\" (UniqueName: \"kubernetes.io/projected/c1b4dd7c-b41f-442b-9fc7-9e248312c359-kube-api-access-lggr2\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.258611 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.434073 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf"] Dec 01 21:39:57 crc kubenswrapper[4962]: W1201 21:39:57.441495 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9954d35_3a54_4528_975f_e834bd649bc6.slice/crio-fff0844413ca55024450fd52a1d978d05f0a89a72e2e2ef04cda6036ad5401d7 WatchSource:0}: Error finding container fff0844413ca55024450fd52a1d978d05f0a89a72e2e2ef04cda6036ad5401d7: Status 404 returned error can't find the container with id fff0844413ca55024450fd52a1d978d05f0a89a72e2e2ef04cda6036ad5401d7 Dec 01 21:39:57 crc kubenswrapper[4962]: I1201 21:39:57.719127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:57 crc kubenswrapper[4962]: E1201 21:39:57.719300 4962 secret.go:188] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 01 21:39:57 crc kubenswrapper[4962]: E1201 21:39:57.719455 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls podName:c1b4dd7c-b41f-442b-9fc7-9e248312c359 nodeName:}" failed. No retries permitted until 2025-12-01 21:39:58.719432155 +0000 UTC m=+382.820871350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "c1b4dd7c-b41f-442b-9fc7-9e248312c359") : secret "alertmanager-main-tls" not found Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.057043 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r"] Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.058753 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.062422 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.062446 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.062558 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-6cbnv" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.062612 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.062637 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.062699 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.063208 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-3qt9dgou2fu2b" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.077367 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r"] Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.124898 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.124993 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.125015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64449\" (UniqueName: \"kubernetes.io/projected/ec3f33c7-9df3-4c44-919e-a84254466ca6-kube-api-access-64449\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.125044 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.125097 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3f33c7-9df3-4c44-919e-a84254466ca6-metrics-client-ca\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.125137 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-tls\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.125160 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-grpc-tls\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.125178 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.224706 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" event={"ID":"3b3107c8-baf4-4aee-a3b9-70df9e5022eb","Type":"ContainerStarted","Data":"1207ec67fd7eaddbba682a45034a20033ca18fb1004dc9bab2af6a7aaefc5d3b"} Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.224755 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" event={"ID":"3b3107c8-baf4-4aee-a3b9-70df9e5022eb","Type":"ContainerStarted","Data":"bc45708551563d87091c42c82d2399182b36cc24b8103429842d9379a228536c"} Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.224770 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" event={"ID":"f9954d35-3a54-4528-975f-e834bd649bc6","Type":"ContainerStarted","Data":"fff0844413ca55024450fd52a1d978d05f0a89a72e2e2ef04cda6036ad5401d7"} Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.225843 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.225884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64449\" (UniqueName: \"kubernetes.io/projected/ec3f33c7-9df3-4c44-919e-a84254466ca6-kube-api-access-64449\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " 
pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.225909 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.225957 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3f33c7-9df3-4c44-919e-a84254466ca6-metrics-client-ca\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.225989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-tls\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.226014 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-grpc-tls\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.226031 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.226068 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.227806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3f33c7-9df3-4c44-919e-a84254466ca6-metrics-client-ca\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.232871 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-tls\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 
21:39:58.234083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.238531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.240094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.242244 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-grpc-tls\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.252024 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ec3f33c7-9df3-4c44-919e-a84254466ca6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.255776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64449\" (UniqueName: \"kubernetes.io/projected/ec3f33c7-9df3-4c44-919e-a84254466ca6-kube-api-access-64449\") pod \"thanos-querier-7f8fc6b774-9pv2r\" (UID: \"ec3f33c7-9df3-4c44-919e-a84254466ca6\") " pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.378197 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.731082 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.736668 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c1b4dd7c-b41f-442b-9fc7-9e248312c359-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c1b4dd7c-b41f-442b-9fc7-9e248312c359\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:58 crc kubenswrapper[4962]: I1201 21:39:58.844410 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 01 21:39:59 crc kubenswrapper[4962]: I1201 21:39:59.226432 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fwdb" event={"ID":"1c1f9eb9-b365-483d-85cb-a877ecaee3b3","Type":"ContainerStarted","Data":"a2e4974fd4ad6f3f32b452045df84e3fcad81861827a419dfc76213cb1f660bf"} Dec 01 21:39:59 crc kubenswrapper[4962]: I1201 21:39:59.396155 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 01 21:39:59 crc kubenswrapper[4962]: W1201 21:39:59.405371 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1b4dd7c_b41f_442b_9fc7_9e248312c359.slice/crio-5467e143ccac30d7953d492c4030d433b020fb7e2559f376363ec376769a5142 WatchSource:0}: Error finding container 5467e143ccac30d7953d492c4030d433b020fb7e2559f376363ec376769a5142: Status 404 returned error can't find the container with id 5467e143ccac30d7953d492c4030d433b020fb7e2559f376363ec376769a5142 Dec 01 21:39:59 crc kubenswrapper[4962]: I1201 21:39:59.497450 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r"] Dec 01 21:39:59 crc kubenswrapper[4962]: W1201 21:39:59.505439 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec3f33c7_9df3_4c44_919e_a84254466ca6.slice/crio-90640f7240048e717572d36f93c07d3122c878ed80869605e8913eb1617fa9c8 WatchSource:0}: Error finding container 90640f7240048e717572d36f93c07d3122c878ed80869605e8913eb1617fa9c8: Status 404 returned error can't find the container with id 90640f7240048e717572d36f93c07d3122c878ed80869605e8913eb1617fa9c8 Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.237049 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c1f9eb9-b365-483d-85cb-a877ecaee3b3" containerID="a2e4974fd4ad6f3f32b452045df84e3fcad81861827a419dfc76213cb1f660bf" exitCode=0 Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.237199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fwdb" event={"ID":"1c1f9eb9-b365-483d-85cb-a877ecaee3b3","Type":"ContainerDied","Data":"a2e4974fd4ad6f3f32b452045df84e3fcad81861827a419dfc76213cb1f660bf"} Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.238254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerStarted","Data":"5467e143ccac30d7953d492c4030d433b020fb7e2559f376363ec376769a5142"} Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.239060 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" event={"ID":"ec3f33c7-9df3-4c44-919e-a84254466ca6","Type":"ContainerStarted","Data":"90640f7240048e717572d36f93c07d3122c878ed80869605e8913eb1617fa9c8"} Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.713618 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-75566bdf7c-zxn4f"] Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.714712 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.730558 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75566bdf7c-zxn4f"] Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.769720 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkmxf\" (UniqueName: \"kubernetes.io/projected/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-kube-api-access-tkmxf\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.769786 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-trusted-ca-bundle\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.769809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-config\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.769832 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-service-ca\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.769880 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-oauth-config\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.769907 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-serving-cert\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.770018 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-oauth-serving-cert\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.870908 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkmxf\" (UniqueName: \"kubernetes.io/projected/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-kube-api-access-tkmxf\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.870980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-trusted-ca-bundle\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.871007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-config\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.871031 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-service-ca\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.871296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-oauth-config\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.871358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-serving-cert\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.871434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-oauth-serving-cert\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.872123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-oauth-serving-cert\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 
21:40:00.872213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-trusted-ca-bundle\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.872671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-config\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.872981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-service-ca\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.881486 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-oauth-config\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.881515 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-serving-cert\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:00 crc kubenswrapper[4962]: I1201 21:40:00.889107 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkmxf\" (UniqueName: \"kubernetes.io/projected/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-kube-api-access-tkmxf\") pod \"console-75566bdf7c-zxn4f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.154623 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.257344 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fwdb" event={"ID":"1c1f9eb9-b365-483d-85cb-a877ecaee3b3","Type":"ContainerStarted","Data":"5a967606fe9cdac9b695427b43365568e4b7a3832e8494a46853d9132e367844"} Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.257384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2fwdb" event={"ID":"1c1f9eb9-b365-483d-85cb-a877ecaee3b3","Type":"ContainerStarted","Data":"803c8c959bc99af9ead6f8871fd761ecca79762f9aed57308f0ae08adacb1ee8"} Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.261708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" event={"ID":"3b3107c8-baf4-4aee-a3b9-70df9e5022eb","Type":"ContainerStarted","Data":"48a1b697aa971174e5a16f13c6446d0f6558486a695ec5996a885afdad505662"} Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.266700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" event={"ID":"f9954d35-3a54-4528-975f-e834bd649bc6","Type":"ContainerStarted","Data":"6c1bd260f7cacab0ae92c0cc0788516e7a194bb13cb8c546f44636353f10f67e"} Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.266736 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" event={"ID":"f9954d35-3a54-4528-975f-e834bd649bc6","Type":"ContainerStarted","Data":"50d68e80d0bc69974b46484abb478e7d3fe31364b78cf2d3f4ddae4dfd27765a"} Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.299810 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2fwdb" podStartSLOduration=3.667227381 podStartE2EDuration="6.299769988s" podCreationTimestamp="2025-12-01 21:39:55 +0000 UTC" firstStartedPulling="2025-12-01 21:39:56.281400582 +0000 UTC m=+380.382839787" lastFinishedPulling="2025-12-01 21:39:58.913943199 +0000 UTC m=+383.015382394" observedRunningTime="2025-12-01 21:40:01.278109032 +0000 UTC m=+385.379548217" watchObservedRunningTime="2025-12-01 21:40:01.299769988 +0000 UTC m=+385.401209193" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.302406 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-t8nvz" podStartSLOduration=3.219561979 podStartE2EDuration="6.302391059s" podCreationTimestamp="2025-12-01 21:39:55 +0000 UTC" firstStartedPulling="2025-12-01 21:39:57.502797083 +0000 UTC m=+381.604236278" lastFinishedPulling="2025-12-01 21:40:00.585626163 +0000 UTC m=+384.687065358" observedRunningTime="2025-12-01 21:40:01.296264863 +0000 UTC m=+385.397704068" watchObservedRunningTime="2025-12-01 21:40:01.302391059 +0000 UTC m=+385.403830254" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.356726 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-98676cfd9-qw7bj"] Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.357428 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.360610 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.360993 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.361063 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-r6vjk" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.361512 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3j36tkuabja74" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.361704 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.361865 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.377688 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-98676cfd9-qw7bj"] Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.386341 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/96d314cb-1713-4e20-8eec-70bedd5cabad-audit-log\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.386415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-server-tls\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.386551 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-client-ca-bundle\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.386633 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-metrics-server-audit-profiles\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.386664 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 
01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.386727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-client-certs\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.386761 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v488\" (UniqueName: \"kubernetes.io/projected/96d314cb-1713-4e20-8eec-70bedd5cabad-kube-api-access-5v488\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.487990 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-server-tls\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.488342 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-client-ca-bundle\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.488380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-metrics-server-audit-profiles\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.488396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.488432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-client-certs\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.488453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v488\" (UniqueName: \"kubernetes.io/projected/96d314cb-1713-4e20-8eec-70bedd5cabad-kube-api-access-5v488\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.488478 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/96d314cb-1713-4e20-8eec-70bedd5cabad-audit-log\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.488949 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/96d314cb-1713-4e20-8eec-70bedd5cabad-audit-log\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.490342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.491027 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-metrics-server-audit-profiles\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.494448 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-client-certs\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.494692 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-client-ca-bundle\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.495475 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-server-tls\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.508701 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v488\" (UniqueName: \"kubernetes.io/projected/96d314cb-1713-4e20-8eec-70bedd5cabad-kube-api-access-5v488\") pod \"metrics-server-98676cfd9-qw7bj\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") " pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.573164 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75566bdf7c-zxn4f"] Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.678462 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.683995 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv"] Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.684901 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.693165 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv"] Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.700954 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.701226 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.791094 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19da0b48-61f6-4af3-89e9-36bfce4480f7-monitoring-plugin-cert\") pod \"monitoring-plugin-78587f9f64-ll6vv\" (UID: \"19da0b48-61f6-4af3-89e9-36bfce4480f7\") " pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.893612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19da0b48-61f6-4af3-89e9-36bfce4480f7-monitoring-plugin-cert\") pod \"monitoring-plugin-78587f9f64-ll6vv\" (UID: \"19da0b48-61f6-4af3-89e9-36bfce4480f7\") " pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" Dec 01 21:40:01 crc kubenswrapper[4962]: I1201 21:40:01.902011 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19da0b48-61f6-4af3-89e9-36bfce4480f7-monitoring-plugin-cert\") pod \"monitoring-plugin-78587f9f64-ll6vv\" (UID: \"19da0b48-61f6-4af3-89e9-36bfce4480f7\") " pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.014424 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.281267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75566bdf7c-zxn4f" event={"ID":"4f3bd254-c29c-48b9-9ecf-0824411c2e6f","Type":"ContainerStarted","Data":"a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54"} Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.281624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75566bdf7c-zxn4f" event={"ID":"4f3bd254-c29c-48b9-9ecf-0824411c2e6f","Type":"ContainerStarted","Data":"d141f7ee7926f800878928608d1e54d0a4cd4c2e4ed6bd75e573d6372cc0d8e0"} Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.304140 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75566bdf7c-zxn4f" podStartSLOduration=2.304102445 podStartE2EDuration="2.304102445s" podCreationTimestamp="2025-12-01 21:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:40:02.301085673 +0000 UTC m=+386.402524868" watchObservedRunningTime="2025-12-01 21:40:02.304102445 +0000 UTC m=+386.405541640" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.307084 4962 generic.go:334] "Generic (PLEG): container finished" podID="c1b4dd7c-b41f-442b-9fc7-9e248312c359" containerID="df5102fbd9bf9a04f96764da54f95a5329acae7da4e5b40c798ec45b78fde9c9" exitCode=0 Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.307652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerDied","Data":"df5102fbd9bf9a04f96764da54f95a5329acae7da4e5b40c798ec45b78fde9c9"} Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.323593 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" event={"ID":"f9954d35-3a54-4528-975f-e834bd649bc6","Type":"ContainerStarted","Data":"30b29577443f08643a52c424275ac715a568da630ced405cf07058d4961ee57c"} Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.364343 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.378266 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cgcbf" podStartSLOduration=4.235914451 podStartE2EDuration="7.378244441s" podCreationTimestamp="2025-12-01 21:39:55 +0000 UTC" firstStartedPulling="2025-12-01 21:39:57.443315094 +0000 UTC m=+381.544754289" lastFinishedPulling="2025-12-01 21:40:00.585645074 +0000 UTC m=+384.687084279" observedRunningTime="2025-12-01 21:40:02.371255102 +0000 UTC m=+386.472694297" watchObservedRunningTime="2025-12-01 21:40:02.378244441 +0000 UTC m=+386.479683636" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.380248 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.392339 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.397268 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.397491 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.397569 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.397715 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.397803 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.417684 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.418212 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-pwcdp" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.418422 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.418660 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.418894 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.419238 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-84mt3gjredp5t" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.421752 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-web-config\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432382 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432596 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432772 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432787 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-config\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432885 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g65wz\" (UniqueName: \"kubernetes.io/projected/89ba1c4c-644c-4832-bc73-5d072d80cfee-kube-api-access-g65wz\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432912 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89ba1c4c-644c-4832-bc73-5d072d80cfee-config-out\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.432926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.433004 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.433056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89ba1c4c-644c-4832-bc73-5d072d80cfee-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.433118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.433137 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.433200 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.433221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.433249 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.451666 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.452995 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-98676cfd9-qw7bj"] Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534130 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89ba1c4c-644c-4832-bc73-5d072d80cfee-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: 
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534235 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534306 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-web-config\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534450 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534490 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534537 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534570 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-config\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534620 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g65wz\" (UniqueName: \"kubernetes.io/projected/89ba1c4c-644c-4832-bc73-5d072d80cfee-kube-api-access-g65wz\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534636 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89ba1c4c-644c-4832-bc73-5d072d80cfee-config-out\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534650 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.534696 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.536382 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.536689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.537781 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.539230 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.539547 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.540787 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv"]
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545311 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-web-config\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89ba1c4c-644c-4832-bc73-5d072d80cfee-config-out\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89ba1c4c-644c-4832-bc73-5d072d80cfee-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-config\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545780 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.545915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.546253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.548479 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89ba1c4c-644c-4832-bc73-5d072d80cfee-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.549797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.550260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89ba1c4c-644c-4832-bc73-5d072d80cfee-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.552292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g65wz\" (UniqueName: \"kubernetes.io/projected/89ba1c4c-644c-4832-bc73-5d072d80cfee-kube-api-access-g65wz\") pod \"prometheus-k8s-0\" (UID: \"89ba1c4c-644c-4832-bc73-5d072d80cfee\") " pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.757588 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.784768 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:40:02 crc kubenswrapper[4962]: I1201 21:40:02.784835 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:40:03 crc kubenswrapper[4962]: I1201 21:40:03.848372 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 01 21:40:03 crc kubenswrapper[4962]: W1201 21:40:03.857007 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89ba1c4c_644c_4832_bc73_5d072d80cfee.slice/crio-5a79541fd9917f371974a45bbd932936384bbf0085e1cbd570aada81b7563244 WatchSource:0}: Error finding container 5a79541fd9917f371974a45bbd932936384bbf0085e1cbd570aada81b7563244: Status 404 returned error can't find the container with id 5a79541fd9917f371974a45bbd932936384bbf0085e1cbd570aada81b7563244 Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.344148 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" event={"ID":"ec3f33c7-9df3-4c44-919e-a84254466ca6","Type":"ContainerStarted","Data":"1cdaa62f37da88a6be4032c92b473d428a36a47167ea71966b1557c4bb7e3f42"} Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.344202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" event={"ID":"ec3f33c7-9df3-4c44-919e-a84254466ca6","Type":"ContainerStarted","Data":"54b298c7a32d450c374d1904b673e9c48d5d1bba150c2fda1a0ff5947277363d"} Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.344218 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" event={"ID":"ec3f33c7-9df3-4c44-919e-a84254466ca6","Type":"ContainerStarted","Data":"848423380ac969eae84cd20865a16f465e22dc2b050b1183e694c7713066e0e4"} Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.345526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" event={"ID":"19da0b48-61f6-4af3-89e9-36bfce4480f7","Type":"ContainerStarted","Data":"267e9c3d9bab295638a74c4b1eaccc5fad1ff8e15af40c6382e2970fdc8ca169"} Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.347188 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerDied","Data":"9c64ddff5875dec90e151ae991bcc2f7019f96fd8a37ef5371da29a52307c7b2"} Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.346996 4962 generic.go:334] "Generic (PLEG): container finished" podID="89ba1c4c-644c-4832-bc73-5d072d80cfee" containerID="9c64ddff5875dec90e151ae991bcc2f7019f96fd8a37ef5371da29a52307c7b2" exitCode=0 Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.348045 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerStarted","Data":"5a79541fd9917f371974a45bbd932936384bbf0085e1cbd570aada81b7563244"} Dec 01 21:40:04 crc kubenswrapper[4962]: I1201 21:40:04.352956 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" event={"ID":"96d314cb-1713-4e20-8eec-70bedd5cabad","Type":"ContainerStarted","Data":"351113dbd95ada2891e84bcc4be8059f46aa569bd39542c0cfe5103142183db0"} Dec 01 21:40:06 crc kubenswrapper[4962]: I1201 21:40:06.370948 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerStarted","Data":"12c1e2961ceec8d39433dccc82804f3a6d8a0583ad2eec143f130dda692a8cdc"} Dec 01 21:40:06 crc kubenswrapper[4962]: I1201 21:40:06.371415 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerStarted","Data":"edbf1981c75efca18dc2a940a6fca88c3a42690b205ebf4d8e52ad7a89645ee7"} Dec 01 21:40:06 crc kubenswrapper[4962]: I1201 21:40:06.373024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" event={"ID":"96d314cb-1713-4e20-8eec-70bedd5cabad","Type":"ContainerStarted","Data":"305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa"} Dec 01 21:40:06 crc kubenswrapper[4962]: I1201 21:40:06.376010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" event={"ID":"19da0b48-61f6-4af3-89e9-36bfce4480f7","Type":"ContainerStarted","Data":"9b513ced4b9ce37b9c6dc5ce11486ceec38987ec104be06a6e7d155a92304cff"} Dec 01 21:40:06 crc kubenswrapper[4962]: I1201 21:40:06.376271 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" Dec 01 21:40:06 crc kubenswrapper[4962]: I1201 21:40:06.383409 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" Dec 01 21:40:06 crc kubenswrapper[4962]: I1201 21:40:06.401026 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" podStartSLOduration=3.154010671 podStartE2EDuration="5.400997194s" podCreationTimestamp="2025-12-01 21:40:01 +0000 UTC" firstStartedPulling="2025-12-01 21:40:03.403218676 +0000 UTC m=+387.504657881" lastFinishedPulling="2025-12-01 21:40:05.650205209 +0000 UTC m=+389.751644404" observedRunningTime="2025-12-01 21:40:06.390518181 +0000 UTC m=+390.491957436" watchObservedRunningTime="2025-12-01 21:40:06.400997194 +0000 UTC m=+390.502436439" Dec 01 21:40:08 crc kubenswrapper[4962]: I1201 21:40:08.391633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerStarted","Data":"efbafef9a6c1d203eed1f8633946d6d9d67875f0083be9d8ec514f47368859c5"} Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.401498 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" event={"ID":"ec3f33c7-9df3-4c44-919e-a84254466ca6","Type":"ContainerStarted","Data":"a744947d2233053f18f71476256594d5f00f0dd03059d4d93376767a54c518f6"} Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.401548 
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.401557 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" event={"ID":"ec3f33c7-9df3-4c44-919e-a84254466ca6","Type":"ContainerStarted","Data":"98f9cfc49d0f015f109c452cd02bcc039b3d03127ac4c394a3b7a76bb37ff3f1"}
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.405623 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerStarted","Data":"c4c66c051a56dcea18bd6cdf2cece3afbc040af57ce4c887a007c64616875aba"}
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.405733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerStarted","Data":"ba0f803dc2a662054849c4df041be426d7f2235104aa99ac3a7c0523faac562f"}
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.405793 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c1b4dd7c-b41f-442b-9fc7-9e248312c359","Type":"ContainerStarted","Data":"a4ea4180b3f9ca92ece8490aad56e213c1674ae7fcd1e11a0c1011187a2bb10e"}
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.409239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerStarted","Data":"7a1fa8fe1f58001e55dedc7d44a5f0ce05a31d66bce206d7ab9fd6067c9dfade"}
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.409339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerStarted","Data":"80cac5e5b81543b8a07da1ea31b1b10326d06356188143c3f9ed5d6284fe17bb"}
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.409400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerStarted","Data":"303a934858b4ff452a263a0916da4fa9c358006d7c5e4c45fba601059c6d4861"}
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.438570 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-78587f9f64-ll6vv" podStartSLOduration=6.192934534 podStartE2EDuration="8.43854722s" podCreationTimestamp="2025-12-01 21:40:01 +0000 UTC" firstStartedPulling="2025-12-01 21:40:03.403262387 +0000 UTC m=+387.504701582" lastFinishedPulling="2025-12-01 21:40:05.648875073 +0000 UTC m=+389.750314268" observedRunningTime="2025-12-01 21:40:06.406323878 +0000 UTC m=+390.507763113" watchObservedRunningTime="2025-12-01 21:40:09.43854722 +0000 UTC m=+393.539986445"
Dec 01 21:40:09 crc kubenswrapper[4962]: I1201 21:40:09.441404 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=6.213472432 podStartE2EDuration="12.441393247s" podCreationTimestamp="2025-12-01 21:39:57 +0000 UTC" firstStartedPulling="2025-12-01 21:39:59.40736795 +0000 UTC m=+383.508807155" lastFinishedPulling="2025-12-01 21:40:05.635288775 +0000 UTC m=+389.736727970" observedRunningTime="2025-12-01 21:40:09.434803679 +0000 UTC m=+393.536242884" watchObservedRunningTime="2025-12-01 21:40:09.441393247 +0000 UTC m=+393.542832482"
Dec 01 21:40:10 crc kubenswrapper[4962]: I1201 21:40:10.424523 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerStarted","Data":"d060af201a6204427c211ab805cd6eff70ddb86d3734cad712873fe60c043ed9"}
Dec 01 21:40:10 crc kubenswrapper[4962]: I1201 21:40:10.424870 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerStarted","Data":"ba6dcefc123078accbdf7f8bf05556f7c44f8a1080f3681beccf4d80ab230b7d"}
Dec 01 21:40:10 crc kubenswrapper[4962]: I1201 21:40:10.425199 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r"
Dec 01 21:40:10 crc kubenswrapper[4962]: I1201 21:40:10.425268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89ba1c4c-644c-4832-bc73-5d072d80cfee","Type":"ContainerStarted","Data":"ec2befa62562ad93e96c397a727b27f093d1f725b3f678aa436fda1b4a341bab"}
Dec 01 21:40:10 crc kubenswrapper[4962]: I1201 21:40:10.438268 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r"
Dec 01 21:40:10 crc kubenswrapper[4962]: I1201 21:40:10.482679 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f8fc6b774-9pv2r" podStartSLOduration=3.311141695 podStartE2EDuration="12.482644872s" podCreationTimestamp="2025-12-01 21:39:58 +0000 UTC" firstStartedPulling="2025-12-01 21:39:59.507634493 +0000 UTC m=+383.609073698" lastFinishedPulling="2025-12-01 21:40:08.67913768 +0000 UTC m=+392.780576875" observedRunningTime="2025-12-01 21:40:10.464855221 +0000 UTC m=+394.566294456" watchObservedRunningTime="2025-12-01 21:40:10.482644872 +0000 UTC m=+394.584084097"
Dec 01 21:40:10 crc kubenswrapper[4962]: I1201 21:40:10.554568 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.215797223 podStartE2EDuration="8.554548358s" podCreationTimestamp="2025-12-01 21:40:02 +0000 UTC" firstStartedPulling="2025-12-01 21:40:04.348773522 +0000 UTC m=+388.450212717" lastFinishedPulling="2025-12-01 21:40:08.687524657 +0000 UTC m=+392.788963852" observedRunningTime="2025-12-01 21:40:10.551442584 +0000 UTC m=+394.652881789" watchObservedRunningTime="2025-12-01 21:40:10.554548358 +0000 UTC m=+394.655987563"
Dec 01 21:40:11 crc kubenswrapper[4962]: I1201 21:40:11.154748 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-75566bdf7c-zxn4f"
Dec 01 21:40:11 crc kubenswrapper[4962]: I1201 21:40:11.154821 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75566bdf7c-zxn4f"
Dec 01 21:40:11 crc kubenswrapper[4962]: I1201 21:40:11.161763 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75566bdf7c-zxn4f"
Dec 01 21:40:11 crc kubenswrapper[4962]: I1201 21:40:11.435166 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75566bdf7c-zxn4f"
Dec 01 21:40:11 crc kubenswrapper[4962]: I1201 21:40:11.492978 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jsv9r"]
Dec 01 21:40:12 crc kubenswrapper[4962]: I1201 21:40:12.758080 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Dec 01 21:40:21 crc kubenswrapper[4962]: I1201 21:40:21.679008 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj"
Dec 01 21:40:21 crc kubenswrapper[4962]: I1201 21:40:21.679744 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj"
Dec 01 21:40:32 crc kubenswrapper[4962]: I1201 21:40:32.784424 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 21:40:32 crc kubenswrapper[4962]: I1201 21:40:32.785115 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 21:40:32 crc kubenswrapper[4962]: I1201 21:40:32.785197 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k"
Dec 01 21:40:32 crc kubenswrapper[4962]: I1201 21:40:32.786296 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"749bd494341ecd94507a174dd68318952a7c94f26fd3fad275718b333cbd13e5"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 21:40:32 crc kubenswrapper[4962]: I1201 21:40:32.786429 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://749bd494341ecd94507a174dd68318952a7c94f26fd3fad275718b333cbd13e5" gracePeriod=600
Dec 01 21:40:34 crc kubenswrapper[4962]: I1201 21:40:34.628391 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="749bd494341ecd94507a174dd68318952a7c94f26fd3fad275718b333cbd13e5" exitCode=0
Dec 01 21:40:34 crc kubenswrapper[4962]: I1201 21:40:34.628486 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"749bd494341ecd94507a174dd68318952a7c94f26fd3fad275718b333cbd13e5"}
Dec 01 21:40:34 crc kubenswrapper[4962]: I1201 21:40:34.628719 4962 scope.go:117] "RemoveContainer" containerID="ca94caf0040f2d86239f149d34e419adcc86594fda2b4189e74f699c125857f6"
Dec 01 21:40:35 crc kubenswrapper[4962]: I1201 21:40:35.641353 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"4b21fa950e24527d3d7f2945a34117bc2a69fe50d90966acf9350574b99da5ad"}
Dec 01 21:40:36 crc kubenswrapper[4962]: I1201 21:40:36.539617 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-jsv9r" podUID="4002c0b0-4b79-4755-a60c-2fc0cdac7876" containerName="console" containerID="cri-o://8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e" gracePeriod=15
Dec 01 21:40:36 crc kubenswrapper[4962]: E1201 21:40:36.579749 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4002c0b0_4b79_4755_a60c_2fc0cdac7876.slice/crio-conmon-8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.019110 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jsv9r_4002c0b0-4b79-4755-a60c-2fc0cdac7876/console/0.log"
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.019574 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jsv9r"
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.206251 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-oauth-config\") pod \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") "
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.206318 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-service-ca\") pod \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") "
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.206369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgq8g\" (UniqueName: \"kubernetes.io/projected/4002c0b0-4b79-4755-a60c-2fc0cdac7876-kube-api-access-zgq8g\") pod \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") "
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.206397 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-serving-cert\") pod \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") "
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.206462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-trusted-ca-bundle\") pod \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") "
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.206505 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-config\") pod \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") "
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.206570 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-oauth-serving-cert\") pod \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\" (UID: \"4002c0b0-4b79-4755-a60c-2fc0cdac7876\") "
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.207351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4002c0b0-4b79-4755-a60c-2fc0cdac7876" (UID: "4002c0b0-4b79-4755-a60c-2fc0cdac7876"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.207845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-config" (OuterVolumeSpecName: "console-config") pod "4002c0b0-4b79-4755-a60c-2fc0cdac7876" (UID: "4002c0b0-4b79-4755-a60c-2fc0cdac7876"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.207929 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4002c0b0-4b79-4755-a60c-2fc0cdac7876" (UID: "4002c0b0-4b79-4755-a60c-2fc0cdac7876"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.208197 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-service-ca" (OuterVolumeSpecName: "service-ca") pod "4002c0b0-4b79-4755-a60c-2fc0cdac7876" (UID: "4002c0b0-4b79-4755-a60c-2fc0cdac7876"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.215088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4002c0b0-4b79-4755-a60c-2fc0cdac7876" (UID: "4002c0b0-4b79-4755-a60c-2fc0cdac7876"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.216764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4002c0b0-4b79-4755-a60c-2fc0cdac7876-kube-api-access-zgq8g" (OuterVolumeSpecName: "kube-api-access-zgq8g") pod "4002c0b0-4b79-4755-a60c-2fc0cdac7876" (UID: "4002c0b0-4b79-4755-a60c-2fc0cdac7876"). InnerVolumeSpecName "kube-api-access-zgq8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.218047 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4002c0b0-4b79-4755-a60c-2fc0cdac7876" (UID: "4002c0b0-4b79-4755-a60c-2fc0cdac7876"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.308814 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.309201 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.309315 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.309374 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-service-ca\") on node \"crc\" DevicePath \"\""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.309428 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgq8g\" (UniqueName: \"kubernetes.io/projected/4002c0b0-4b79-4755-a60c-2fc0cdac7876-kube-api-access-zgq8g\") on node \"crc\" DevicePath \"\""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.309485 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4002c0b0-4b79-4755-a60c-2fc0cdac7876-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.309535 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4002c0b0-4b79-4755-a60c-2fc0cdac7876-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.656106 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jsv9r_4002c0b0-4b79-4755-a60c-2fc0cdac7876/console/0.log"
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.656161 4962 generic.go:334] "Generic (PLEG): container finished" podID="4002c0b0-4b79-4755-a60c-2fc0cdac7876" containerID="8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e" exitCode=2
Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.656253 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jsv9r"
Need to start a new one" pod="openshift-console/console-f9d7485db-jsv9r" Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.656188 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jsv9r" event={"ID":"4002c0b0-4b79-4755-a60c-2fc0cdac7876","Type":"ContainerDied","Data":"8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e"} Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.657202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jsv9r" event={"ID":"4002c0b0-4b79-4755-a60c-2fc0cdac7876","Type":"ContainerDied","Data":"21538d0a3c59fee3e4d2509fe44592ed0e6dfec886c727b0424564727581376b"} Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.657255 4962 scope.go:117] "RemoveContainer" containerID="8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e" Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.681251 4962 scope.go:117] "RemoveContainer" containerID="8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e" Dec 01 21:40:37 crc kubenswrapper[4962]: E1201 21:40:37.684031 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e\": container with ID starting with 8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e not found: ID does not exist" containerID="8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e" Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.684216 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e"} err="failed to get container status \"8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e\": rpc error: code = NotFound desc = could not find container \"8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e\": container with ID starting with 8d8ebb63957c51726f2495488746d41c334b0d221e6f54c6d88543e7d287cd6e not found: ID does not exist" Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.693032 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jsv9r"] Dec 01 21:40:37 crc kubenswrapper[4962]: I1201 21:40:37.702429 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jsv9r"] Dec 01 21:40:38 crc kubenswrapper[4962]: I1201 21:40:38.228595 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4002c0b0-4b79-4755-a60c-2fc0cdac7876" path="/var/lib/kubelet/pods/4002c0b0-4b79-4755-a60c-2fc0cdac7876/volumes" Dec 01 21:40:41 crc kubenswrapper[4962]: I1201 21:40:41.688991 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:40:41 crc kubenswrapper[4962]: I1201 21:40:41.697085 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" Dec 01 21:41:02 crc kubenswrapper[4962]: I1201 21:41:02.758379 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:41:02 crc kubenswrapper[4962]: I1201 21:41:02.800073 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:41:02 crc kubenswrapper[4962]: I1201 21:41:02.888042 4962 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.026310 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bcb5d4c85-jhx6l"] Dec 01 21:41:16 crc kubenswrapper[4962]: E1201 21:41:16.027282 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4002c0b0-4b79-4755-a60c-2fc0cdac7876" containerName="console" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.027301 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4002c0b0-4b79-4755-a60c-2fc0cdac7876" containerName="console" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.027458 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4002c0b0-4b79-4755-a60c-2fc0cdac7876" containerName="console" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.027972 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.042904 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcb5d4c85-jhx6l"] Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.064099 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-trusted-ca-bundle\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.064142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xxth\" (UniqueName: \"kubernetes.io/projected/2571680f-abc0-4c0e-8178-c8e336cca4b4-kube-api-access-5xxth\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.064182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-service-ca\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.064413 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-serving-cert\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.064478 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-oauth-serving-cert\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.064525 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-oauth-config\") pod 
\"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.064571 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-config\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.165581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-serving-cert\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.165627 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-oauth-serving-cert\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.165645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-oauth-config\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.165667 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-config\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.165717 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-trusted-ca-bundle\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.165748 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xxth\" (UniqueName: \"kubernetes.io/projected/2571680f-abc0-4c0e-8178-c8e336cca4b4-kube-api-access-5xxth\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.165780 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-service-ca\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.166778 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-service-ca\") pod 
\"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.166816 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-config\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.166898 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-trusted-ca-bundle\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.167451 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-oauth-serving-cert\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.173584 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-serving-cert\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.176021 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-oauth-config\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.180572 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xxth\" (UniqueName: \"kubernetes.io/projected/2571680f-abc0-4c0e-8178-c8e336cca4b4-kube-api-access-5xxth\") pod \"console-7bcb5d4c85-jhx6l\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.353850 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.839081 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcb5d4c85-jhx6l"] Dec 01 21:41:16 crc kubenswrapper[4962]: I1201 21:41:16.967035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcb5d4c85-jhx6l" event={"ID":"2571680f-abc0-4c0e-8178-c8e336cca4b4","Type":"ContainerStarted","Data":"e2c3be77b352e830eb018340fa37c420d5dcfb10a66df42339e8612448ce847c"} Dec 01 21:41:17 crc kubenswrapper[4962]: I1201 21:41:17.977588 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcb5d4c85-jhx6l" event={"ID":"2571680f-abc0-4c0e-8178-c8e336cca4b4","Type":"ContainerStarted","Data":"e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64"} Dec 01 21:41:18 crc kubenswrapper[4962]: I1201 21:41:18.009483 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bcb5d4c85-jhx6l" podStartSLOduration=2.009463099 podStartE2EDuration="2.009463099s" podCreationTimestamp="2025-12-01 21:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:41:18.002139993 +0000 UTC m=+462.103579228" watchObservedRunningTime="2025-12-01 21:41:18.009463099 +0000 UTC m=+462.110902314" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.370988 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs"] Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.371869 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.384008 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-98676cfd9-qw7bj"] Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.384834 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" podUID="96d314cb-1713-4e20-8eec-70bedd5cabad" containerName="metrics-server" containerID="cri-o://305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa" gracePeriod=170 Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.395357 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs"] Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.440083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a814a33-ec76-4d8d-882e-393c13fbf48c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.440175 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0a814a33-ec76-4d8d-882e-393c13fbf48c-metrics-server-audit-profiles\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.440211 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0a814a33-ec76-4d8d-882e-393c13fbf48c-audit-log\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.440268 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-client-ca-bundle\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.440315 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-secret-metrics-client-certs\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.440354 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-secret-metrics-server-tls\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.440382 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkwb\" (UniqueName: \"kubernetes.io/projected/0a814a33-ec76-4d8d-882e-393c13fbf48c-kube-api-access-tvkwb\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.541743 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0a814a33-ec76-4d8d-882e-393c13fbf48c-metrics-server-audit-profiles\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.541815 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0a814a33-ec76-4d8d-882e-393c13fbf48c-audit-log\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.541848 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-client-ca-bundle\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.541889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-secret-metrics-client-certs\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.541923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-secret-metrics-server-tls\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.541957 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkwb\" (UniqueName: \"kubernetes.io/projected/0a814a33-ec76-4d8d-882e-393c13fbf48c-kube-api-access-tvkwb\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.541992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a814a33-ec76-4d8d-882e-393c13fbf48c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.542738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0a814a33-ec76-4d8d-882e-393c13fbf48c-audit-log\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.542975 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a814a33-ec76-4d8d-882e-393c13fbf48c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.545755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0a814a33-ec76-4d8d-882e-393c13fbf48c-metrics-server-audit-profiles\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.549382 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-secret-metrics-server-tls\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.550726 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-secret-metrics-client-certs\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: 
\"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.552573 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a814a33-ec76-4d8d-882e-393c13fbf48c-client-ca-bundle\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.572892 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkwb\" (UniqueName: \"kubernetes.io/projected/0a814a33-ec76-4d8d-882e-393c13fbf48c-kube-api-access-tvkwb\") pod \"metrics-server-5f75bfbf8b-qc2cs\" (UID: \"0a814a33-ec76-4d8d-882e-393c13fbf48c\") " pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.702506 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.955164 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs"] Dec 01 21:41:20 crc kubenswrapper[4962]: W1201 21:41:20.958970 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a814a33_ec76_4d8d_882e_393c13fbf48c.slice/crio-7d36e33b8627017a8854d1ce288871bcb77bb10878f27272df9ecfda73dec6b9 WatchSource:0}: Error finding container 7d36e33b8627017a8854d1ce288871bcb77bb10878f27272df9ecfda73dec6b9: Status 404 returned error can't find the container with id 7d36e33b8627017a8854d1ce288871bcb77bb10878f27272df9ecfda73dec6b9 Dec 01 21:41:20 crc kubenswrapper[4962]: I1201 21:41:20.999293 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" event={"ID":"0a814a33-ec76-4d8d-882e-393c13fbf48c","Type":"ContainerStarted","Data":"7d36e33b8627017a8854d1ce288871bcb77bb10878f27272df9ecfda73dec6b9"} Dec 01 21:41:22 crc kubenswrapper[4962]: I1201 21:41:22.009558 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" event={"ID":"0a814a33-ec76-4d8d-882e-393c13fbf48c","Type":"ContainerStarted","Data":"d7409e1bf0074b36afbfd2be4993e5c3fb39a18b424a076680952af8c4dda022"} Dec 01 21:41:22 crc kubenswrapper[4962]: I1201 21:41:22.038560 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" podStartSLOduration=2.038527567 podStartE2EDuration="2.038527567s" podCreationTimestamp="2025-12-01 21:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:41:22.036469896 +0000 UTC m=+466.137909151" watchObservedRunningTime="2025-12-01 21:41:22.038527567 +0000 UTC m=+466.139966832" Dec 01 21:41:26 crc kubenswrapper[4962]: I1201 21:41:26.354633 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:26 crc kubenswrapper[4962]: I1201 21:41:26.355025 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:26 crc kubenswrapper[4962]: I1201 21:41:26.362344 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:27 crc kubenswrapper[4962]: I1201 21:41:27.059212 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:41:27 crc kubenswrapper[4962]: I1201 21:41:27.138979 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75566bdf7c-zxn4f"] Dec 01 21:41:40 crc kubenswrapper[4962]: I1201 21:41:40.703228 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:40 crc kubenswrapper[4962]: I1201 21:41:40.704017 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.211096 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-75566bdf7c-zxn4f" podUID="4f3bd254-c29c-48b9-9ecf-0824411c2e6f" containerName="console" containerID="cri-o://a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54" gracePeriod=15 Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.632345 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75566bdf7c-zxn4f_4f3bd254-c29c-48b9-9ecf-0824411c2e6f/console/0.log" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.632442 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.692930 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-service-ca\") pod \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.693096 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkmxf\" (UniqueName: \"kubernetes.io/projected/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-kube-api-access-tkmxf\") pod \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.693158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-oauth-config\") pod \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.693213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-trusted-ca-bundle\") pod \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.693282 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-serving-cert\") pod \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.693342 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-config\") pod \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.693415 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-oauth-serving-cert\") pod \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\" (UID: \"4f3bd254-c29c-48b9-9ecf-0824411c2e6f\") " Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.694401 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-service-ca" (OuterVolumeSpecName: "service-ca") pod "4f3bd254-c29c-48b9-9ecf-0824411c2e6f" (UID: "4f3bd254-c29c-48b9-9ecf-0824411c2e6f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.694565 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-config" (OuterVolumeSpecName: "console-config") pod "4f3bd254-c29c-48b9-9ecf-0824411c2e6f" (UID: "4f3bd254-c29c-48b9-9ecf-0824411c2e6f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.695303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4f3bd254-c29c-48b9-9ecf-0824411c2e6f" (UID: "4f3bd254-c29c-48b9-9ecf-0824411c2e6f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.695348 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4f3bd254-c29c-48b9-9ecf-0824411c2e6f" (UID: "4f3bd254-c29c-48b9-9ecf-0824411c2e6f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.701122 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4f3bd254-c29c-48b9-9ecf-0824411c2e6f" (UID: "4f3bd254-c29c-48b9-9ecf-0824411c2e6f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.701406 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4f3bd254-c29c-48b9-9ecf-0824411c2e6f" (UID: "4f3bd254-c29c-48b9-9ecf-0824411c2e6f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.701465 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-kube-api-access-tkmxf" (OuterVolumeSpecName: "kube-api-access-tkmxf") pod "4f3bd254-c29c-48b9-9ecf-0824411c2e6f" (UID: "4f3bd254-c29c-48b9-9ecf-0824411c2e6f"). InnerVolumeSpecName "kube-api-access-tkmxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.794788 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.795080 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.795092 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkmxf\" (UniqueName: \"kubernetes.io/projected/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-kube-api-access-tkmxf\") on node \"crc\" DevicePath \"\"" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.795103 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.795112 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.795121 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:41:52 crc kubenswrapper[4962]: I1201 21:41:52.795129 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f3bd254-c29c-48b9-9ecf-0824411c2e6f-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.262142 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75566bdf7c-zxn4f_4f3bd254-c29c-48b9-9ecf-0824411c2e6f/console/0.log" Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.262221 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f3bd254-c29c-48b9-9ecf-0824411c2e6f" containerID="a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54" exitCode=2 Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.262265 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75566bdf7c-zxn4f" event={"ID":"4f3bd254-c29c-48b9-9ecf-0824411c2e6f","Type":"ContainerDied","Data":"a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54"} Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.262302 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75566bdf7c-zxn4f" event={"ID":"4f3bd254-c29c-48b9-9ecf-0824411c2e6f","Type":"ContainerDied","Data":"d141f7ee7926f800878928608d1e54d0a4cd4c2e4ed6bd75e573d6372cc0d8e0"} Dec 01 
21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.262310 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75566bdf7c-zxn4f" Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.262333 4962 scope.go:117] "RemoveContainer" containerID="a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54" Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.296650 4962 scope.go:117] "RemoveContainer" containerID="a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54" Dec 01 21:41:53 crc kubenswrapper[4962]: E1201 21:41:53.297319 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54\": container with ID starting with a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54 not found: ID does not exist" containerID="a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54" Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.297411 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54"} err="failed to get container status \"a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54\": rpc error: code = NotFound desc = could not find container \"a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54\": container with ID starting with a68832c66f791bdb00d2078fc1724d178d13515fbaf21cc4cd053997dfd14c54 not found: ID does not exist" Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.315628 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75566bdf7c-zxn4f"] Dec 01 21:41:53 crc kubenswrapper[4962]: I1201 21:41:53.323419 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75566bdf7c-zxn4f"] Dec 01 21:41:54 crc kubenswrapper[4962]: I1201 21:41:54.234909 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3bd254-c29c-48b9-9ecf-0824411c2e6f" path="/var/lib/kubelet/pods/4f3bd254-c29c-48b9-9ecf-0824411c2e6f/volumes" Dec 01 21:42:00 crc kubenswrapper[4962]: I1201 21:42:00.711364 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:42:00 crc kubenswrapper[4962]: I1201 21:42:00.720428 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5f75bfbf8b-qc2cs" Dec 01 21:42:44 crc kubenswrapper[4962]: I1201 21:42:44.956394 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv"] Dec 01 21:42:44 crc kubenswrapper[4962]: E1201 21:42:44.958403 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3bd254-c29c-48b9-9ecf-0824411c2e6f" containerName="console" Dec 01 21:42:44 crc kubenswrapper[4962]: I1201 21:42:44.958448 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3bd254-c29c-48b9-9ecf-0824411c2e6f" containerName="console" Dec 01 21:42:44 crc kubenswrapper[4962]: I1201 21:42:44.958680 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3bd254-c29c-48b9-9ecf-0824411c2e6f" containerName="console" Dec 01 21:42:44 crc kubenswrapper[4962]: I1201 21:42:44.959995 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:44 crc kubenswrapper[4962]: I1201 21:42:44.964801 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 21:42:44 crc kubenswrapper[4962]: I1201 21:42:44.965952 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv"] Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.019893 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.019955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.020010 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqlzk\" (UniqueName: \"kubernetes.io/projected/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-kube-api-access-rqlzk\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.120993 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.121050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.121102 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqlzk\" (UniqueName: \"kubernetes.io/projected/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-kube-api-access-rqlzk\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.121891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.122098 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.143801 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqlzk\" (UniqueName: \"kubernetes.io/projected/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-kube-api-access-rqlzk\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.288840 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:45 crc kubenswrapper[4962]: I1201 21:42:45.745061 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv"] Dec 01 21:42:45 crc kubenswrapper[4962]: W1201 21:42:45.750992 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e0bbfb5_52f3_49bc_9db7_0d80859dbb2c.slice/crio-02fe937bd1d338fc460eba54c010739a0273f308dfbc69ff1cc039cffca7887c WatchSource:0}: Error finding container 02fe937bd1d338fc460eba54c010739a0273f308dfbc69ff1cc039cffca7887c: Status 404 returned error can't find the container with id 02fe937bd1d338fc460eba54c010739a0273f308dfbc69ff1cc039cffca7887c Dec 01 21:42:46 crc kubenswrapper[4962]: I1201 21:42:46.689459 4962 generic.go:334] "Generic (PLEG): container finished" podID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerID="17a016c68ef96b869fd86a1544e9b76b6fc066874409d1e56d24413cdca57232" exitCode=0 Dec 01 21:42:46 crc kubenswrapper[4962]: I1201 21:42:46.689558 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" event={"ID":"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c","Type":"ContainerDied","Data":"17a016c68ef96b869fd86a1544e9b76b6fc066874409d1e56d24413cdca57232"} Dec 01 21:42:46 crc kubenswrapper[4962]: I1201 21:42:46.691806 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" event={"ID":"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c","Type":"ContainerStarted","Data":"02fe937bd1d338fc460eba54c010739a0273f308dfbc69ff1cc039cffca7887c"} Dec 01 21:42:46 crc kubenswrapper[4962]: I1201 21:42:46.691305 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 21:42:48 crc kubenswrapper[4962]: I1201 21:42:48.710831 4962 generic.go:334] "Generic (PLEG): container finished" podID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" 
containerID="518a35ea5128edf687a4604d5b0dff051977cca5b2bccca82d2844d9ffe6086b" exitCode=0 Dec 01 21:42:48 crc kubenswrapper[4962]: I1201 21:42:48.711105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" event={"ID":"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c","Type":"ContainerDied","Data":"518a35ea5128edf687a4604d5b0dff051977cca5b2bccca82d2844d9ffe6086b"} Dec 01 21:42:49 crc kubenswrapper[4962]: I1201 21:42:49.723270 4962 generic.go:334] "Generic (PLEG): container finished" podID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerID="7c46261b9db40ea35fa8f4681509033924f20e28d36f46bf1d2f7c300789e3ce" exitCode=0 Dec 01 21:42:49 crc kubenswrapper[4962]: I1201 21:42:49.723392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" event={"ID":"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c","Type":"ContainerDied","Data":"7c46261b9db40ea35fa8f4681509033924f20e28d36f46bf1d2f7c300789e3ce"} Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.066431 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.210294 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-bundle\") pod \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.210494 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqlzk\" (UniqueName: \"kubernetes.io/projected/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-kube-api-access-rqlzk\") pod \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.210641 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-util\") pod \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\" (UID: \"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c\") " Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.217731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-kube-api-access-rqlzk" (OuterVolumeSpecName: "kube-api-access-rqlzk") pod "1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" (UID: "1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c"). InnerVolumeSpecName "kube-api-access-rqlzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.218149 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-bundle" (OuterVolumeSpecName: "bundle") pod "1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" (UID: "1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.230249 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-util" (OuterVolumeSpecName: "util") pod "1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" (UID: "1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.312740 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.312782 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqlzk\" (UniqueName: \"kubernetes.io/projected/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-kube-api-access-rqlzk\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.312805 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c-util\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.740336 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" event={"ID":"1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c","Type":"ContainerDied","Data":"02fe937bd1d338fc460eba54c010739a0273f308dfbc69ff1cc039cffca7887c"} Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.740384 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02fe937bd1d338fc460eba54c010739a0273f308dfbc69ff1cc039cffca7887c" Dec 01 21:42:51 crc kubenswrapper[4962]: I1201 21:42:51.740388 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.205465 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j77n9"] Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.206613 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovn-controller" containerID="cri-o://da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.206664 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="northd" containerID="cri-o://70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.206677 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="sbdb" containerID="cri-o://893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.206816 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="nbdb" containerID="cri-o://81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.206801 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovn-acl-logging" 
containerID="cri-o://7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.206780 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kube-rbac-proxy-node" containerID="cri-o://2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.206906 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.241791 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" containerID="cri-o://2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581" gracePeriod=30 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.781178 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovnkube-controller/3.log" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.783462 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovn-acl-logging/0.log" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784041 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovn-controller/0.log" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784415 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581" exitCode=0 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784439 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05" exitCode=0 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784448 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925" exitCode=0 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784456 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864" exitCode=0 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784463 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89" exitCode=143 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784470 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712" exitCode=143 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784501 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581"} Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784561 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05"} Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925"} Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864"} Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784607 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89"} Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784614 4962 scope.go:117] "RemoveContainer" containerID="52ceea6c3fd351be1043a8bf07055a4d385f97a923ff2f08f1ddf810320d61ce" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.784619 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712"} Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.786640 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/2.log" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.787178 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/1.log" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.787211 4962 generic.go:334] "Generic (PLEG): container finished" podID="f38b9e31-13b0-4a48-93bf-b3722ca60642" containerID="1b8d562c177ec53feb71127f293276762385b527ac37171b4992a030c29c6db7" exitCode=2 Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.787233 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerDied","Data":"1b8d562c177ec53feb71127f293276762385b527ac37171b4992a030c29c6db7"} Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.787723 4962 scope.go:117] "RemoveContainer" containerID="1b8d562c177ec53feb71127f293276762385b527ac37171b4992a030c29c6db7" Dec 01 21:42:56 crc kubenswrapper[4962]: E1201 21:42:56.787985 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-m4wg5_openshift-multus(f38b9e31-13b0-4a48-93bf-b3722ca60642)\"" pod="openshift-multus/multus-m4wg5" 
podUID="f38b9e31-13b0-4a48-93bf-b3722ca60642" Dec 01 21:42:56 crc kubenswrapper[4962]: I1201 21:42:56.811807 4962 scope.go:117] "RemoveContainer" containerID="e13ebe98bcf0b54dcebec65942330d0f78a7f024cc3dbe2b2499bf7c572541c1" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.432173 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovn-acl-logging/0.log" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.432715 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovn-controller/0.log" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.433180 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.492675 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bdx49"] Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.492889 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.492901 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.492910 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovn-acl-logging" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.492916 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovn-acl-logging" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.492925 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="northd" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.492949 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="northd" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.492959 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerName="extract" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.492965 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerName="extract" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.492972 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovn-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.492978 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovn-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.492989 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerName="pull" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.492994 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerName="pull" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493001 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" 
containerName="kube-rbac-proxy-node" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493006 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kube-rbac-proxy-node" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493012 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kubecfg-setup" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493017 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kubecfg-setup" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493026 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493032 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493040 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerName="util" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493046 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerName="util" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493052 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493058 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493067 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493074 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493081 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="nbdb" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493087 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="nbdb" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493097 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="sbdb" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493102 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="sbdb" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493113 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493118 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493225 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" 
containerName="ovn-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493234 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="nbdb" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493242 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kube-rbac-proxy-node" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493249 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493255 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="sbdb" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493264 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493271 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493282 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovn-acl-logging" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493293 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493302 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c" containerName="extract" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493312 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="northd" Dec 01 21:42:57 crc kubenswrapper[4962]: E1201 21:42:57.493436 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493446 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493555 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.493785 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerName="ovnkube-controller" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.495353 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613340 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-etc-openvswitch\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-env-overrides\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-openvswitch\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613437 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-script-lib\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613463 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg5ph\" (UniqueName: \"kubernetes.io/projected/017b2e87-9a6e-46c6-b061-1ed93bfd2322-kube-api-access-fg5ph\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613502 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-ovn-kubernetes\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613534 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovn-node-metrics-cert\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-systemd-units\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-netns\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613582 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-systemd\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613608 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-log-socket\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613624 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-bin\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613637 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-netd\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613650 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-ovn\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613667 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-slash\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613683 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-var-lib-openvswitch\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-config\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613744 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-node-log\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-var-lib-cni-networks-ovn-kubernetes\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613777 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613826 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-slash" (OuterVolumeSpecName: "host-slash") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613844 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613868 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-log-socket" (OuterVolumeSpecName: "log-socket") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613865 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613889 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613889 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613916 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-node-log" (OuterVolumeSpecName: "node-log") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.613795 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-kubelet\") pod \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\" (UID: \"017b2e87-9a6e-46c6-b061-1ed93bfd2322\") " Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614147 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614166 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614217 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovn-node-metrics-cert\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovnkube-config\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614323 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-etc-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-var-lib-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614367 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovnkube-script-lib\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614421 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-node-log\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-run-netns\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614482 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-slash\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614545 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614773 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427qw\" (UniqueName: \"kubernetes.io/projected/7b93d9d1-8679-4de4-84d6-adcfaae055d6-kube-api-access-427qw\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-systemd-units\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614823 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-cni-bin\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614852 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-cni-netd\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.614975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc 
kubenswrapper[4962]: I1201 21:42:57.615152 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-systemd\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-env-overrides\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615234 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-log-socket\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615274 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-kubelet\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615300 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-ovn\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615368 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615379 4962 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615391 4962 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615400 4962 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615409 4962 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615419 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615426 4962 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615435 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615447 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615456 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615464 4962 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615472 4962 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615481 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615490 4962 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615497 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615505 4962 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.615512 4962 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.627300 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.634163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017b2e87-9a6e-46c6-b061-1ed93bfd2322-kube-api-access-fg5ph" (OuterVolumeSpecName: "kube-api-access-fg5ph") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "kube-api-access-fg5ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.644533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "017b2e87-9a6e-46c6-b061-1ed93bfd2322" (UID: "017b2e87-9a6e-46c6-b061-1ed93bfd2322"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716108 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-log-socket\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-kubelet\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716183 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-ovn\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716210 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovn-node-metrics-cert\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovnkube-config\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-etc-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716243 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-log-socket\") pod \"ovnkube-node-bdx49\" (UID: 
\"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716275 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-kubelet\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716311 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-ovn\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-var-lib-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716266 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-var-lib-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovnkube-script-lib\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716390 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-etc-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-node-log\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716525 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-run-netns\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716546 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-slash\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: 
I1201 21:42:57.716586 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-node-log\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716617 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716596 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716642 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-run-netns\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716660 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-slash\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716684 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716704 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427qw\" (UniqueName: \"kubernetes.io/projected/7b93d9d1-8679-4de4-84d6-adcfaae055d6-kube-api-access-427qw\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-systemd-units\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716762 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-cni-bin\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 
21:42:57.716795 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-cni-netd\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716816 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-systemd\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-env-overrides\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.716986 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg5ph\" (UniqueName: \"kubernetes.io/projected/017b2e87-9a6e-46c6-b061-1ed93bfd2322-kube-api-access-fg5ph\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717002 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/017b2e87-9a6e-46c6-b061-1ed93bfd2322-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717007 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovnkube-config\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717019 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovnkube-script-lib\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717012 4962 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/017b2e87-9a6e-46c6-b061-1ed93bfd2322-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717060 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-openvswitch\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717062 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-systemd-units\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717060 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-cni-netd\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717095 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-run-systemd\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717091 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93d9d1-8679-4de4-84d6-adcfaae055d6-host-cni-bin\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.717434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93d9d1-8679-4de4-84d6-adcfaae055d6-env-overrides\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.720562 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93d9d1-8679-4de4-84d6-adcfaae055d6-ovn-node-metrics-cert\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.738427 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427qw\" (UniqueName: \"kubernetes.io/projected/7b93d9d1-8679-4de4-84d6-adcfaae055d6-kube-api-access-427qw\") pod \"ovnkube-node-bdx49\" (UID: \"7b93d9d1-8679-4de4-84d6-adcfaae055d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.808282 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.809016 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/2.log" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.813729 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovn-acl-logging/0.log" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814081 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j77n9_017b2e87-9a6e-46c6-b061-1ed93bfd2322/ovn-controller/0.log" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814348 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e" exitCode=0 Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814370 4962 generic.go:334] "Generic (PLEG): container finished" podID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" containerID="2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417" exitCode=0 Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e"} Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814411 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417"} Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814421 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" event={"ID":"017b2e87-9a6e-46c6-b061-1ed93bfd2322","Type":"ContainerDied","Data":"04a387e5711598ad2964df75f2cc21cf336310ff885eaab7bd2f443a7d190bad"} Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814435 4962 scope.go:117] "RemoveContainer" containerID="2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.814528 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j77n9" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.842103 4962 scope.go:117] "RemoveContainer" containerID="893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.850202 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j77n9"] Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.858770 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j77n9"] Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.882024 4962 scope.go:117] "RemoveContainer" containerID="81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.908107 4962 scope.go:117] "RemoveContainer" containerID="70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.933974 4962 scope.go:117] "RemoveContainer" containerID="2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.953675 4962 scope.go:117] "RemoveContainer" containerID="2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.972880 4962 scope.go:117] "RemoveContainer" containerID="7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89" Dec 01 21:42:57 crc kubenswrapper[4962]: I1201 21:42:57.986202 4962 scope.go:117] "RemoveContainer" containerID="da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.043404 4962 scope.go:117] "RemoveContainer" containerID="b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.060220 4962 scope.go:117] "RemoveContainer" containerID="2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.060604 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581\": container with ID starting with 2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581 not found: ID does not exist" containerID="2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.060643 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581"} err="failed to get container status \"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581\": rpc error: code = NotFound desc = could not find container \"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581\": container with ID starting with 2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.060670 4962 scope.go:117] "RemoveContainer" containerID="893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.060985 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\": container with ID starting with 
893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05 not found: ID does not exist" containerID="893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.061027 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05"} err="failed to get container status \"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\": rpc error: code = NotFound desc = could not find container \"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\": container with ID starting with 893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.061050 4962 scope.go:117] "RemoveContainer" containerID="81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.061429 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\": container with ID starting with 81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925 not found: ID does not exist" containerID="81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.061452 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925"} err="failed to get container status \"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\": rpc error: code = NotFound desc = could not find container \"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\": container with ID starting with 81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.061465 4962 scope.go:117] "RemoveContainer" containerID="70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.061711 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\": container with ID starting with 70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864 not found: ID does not exist" containerID="70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.061748 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864"} err="failed to get container status \"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\": rpc error: code = NotFound desc = could not find container \"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\": container with ID starting with 70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.061775 4962 scope.go:117] "RemoveContainer" containerID="2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.062022 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\": container with ID starting with 2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e not found: ID does not exist" containerID="2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062043 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e"} err="failed to get container status \"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\": rpc error: code = NotFound desc = could not find container \"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\": container with ID starting with 2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062057 4962 scope.go:117] "RemoveContainer" containerID="2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.062313 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\": container with ID starting with 2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417 not found: ID does not exist" containerID="2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062352 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417"} err="failed to get container status \"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\": rpc error: code = NotFound desc = could not find container \"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\": container with ID starting with 2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062378 4962 scope.go:117] "RemoveContainer" containerID="7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.062673 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\": container with ID starting with 7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89 not found: ID does not exist" containerID="7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062696 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89"} err="failed to get container status \"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\": rpc error: code = NotFound desc = could not find container \"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\": container with ID starting with 7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062709 4962 scope.go:117] "RemoveContainer" 
containerID="da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.062885 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\": container with ID starting with da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712 not found: ID does not exist" containerID="da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062907 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712"} err="failed to get container status \"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\": rpc error: code = NotFound desc = could not find container \"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\": container with ID starting with da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.062925 4962 scope.go:117] "RemoveContainer" containerID="b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69" Dec 01 21:42:58 crc kubenswrapper[4962]: E1201 21:42:58.063561 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\": container with ID starting with b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69 not found: ID does not exist" containerID="b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.063585 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69"} err="failed to get container status \"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\": rpc error: code = NotFound desc = could not find container \"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\": container with ID starting with b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.063604 4962 scope.go:117] "RemoveContainer" containerID="2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.063806 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581"} err="failed to get container status \"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581\": rpc error: code = NotFound desc = could not find container \"2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581\": container with ID starting with 2ddb62c277e10551212f6291e472c1576caa040e6b2f4fcfe115b91a1bb25581 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.063827 4962 scope.go:117] "RemoveContainer" containerID="893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064036 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05"} err="failed to get container status \"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\": rpc error: code = NotFound desc = could not find container \"893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05\": container with ID starting with 893a9b3cc35f66e4d7ce63bd2c5c94a0e5824466a37b2281e05c4e92b5e90a05 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064054 4962 scope.go:117] "RemoveContainer" containerID="81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064252 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925"} err="failed to get container status \"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\": rpc error: code = NotFound desc = could not find container \"81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925\": container with ID starting with 81854e8f9430ba467e0aecead00fad301f92f85fdcca3f93f5e4aad9bba65925 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064270 4962 scope.go:117] "RemoveContainer" containerID="70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064543 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864"} err="failed to get container status \"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\": rpc error: code = NotFound desc = could not find container \"70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864\": container with ID starting with 70bcf088189f0158815dd70d752870decf941ad51bbfbe7c0c872da603659864 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064559 4962 scope.go:117] "RemoveContainer" containerID="2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064774 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e"} err="failed to get container status \"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\": rpc error: code = NotFound desc = could not find container \"2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e\": container with ID starting with 2a2d1885515e7eab97b61b268f14f9fab5ff02464621e1a9b6adfee4c032ca3e not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.064788 4962 scope.go:117] "RemoveContainer" containerID="2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.065101 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417"} err="failed to get container status \"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\": rpc error: code = NotFound desc = could not find container \"2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417\": container with ID starting with 2c2b5c061f60dc51486c1b3b6fd97f18f63860ecb75697b61ee8288c43f23417 not found: ID does not exist" Dec 
01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.065117 4962 scope.go:117] "RemoveContainer" containerID="7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.065348 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89"} err="failed to get container status \"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\": rpc error: code = NotFound desc = could not find container \"7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89\": container with ID starting with 7f5f31e9f2e590d4d3319248b0a4dca7e1f6575cb973ec7cc3e33fc91eb82d89 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.065369 4962 scope.go:117] "RemoveContainer" containerID="da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.065597 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712"} err="failed to get container status \"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\": rpc error: code = NotFound desc = could not find container \"da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712\": container with ID starting with da8a6f1fac0baebddfe20bd0ebda104356ad37e9f128affd8df2757584e8e712 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.065614 4962 scope.go:117] "RemoveContainer" containerID="b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.065841 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69"} err="failed to get container status \"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\": rpc error: code = NotFound desc = could not find container \"b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69\": container with ID starting with b65212f510b5b611fa332185dd11e28e9201a2a5ca438bf218dffde3565e7a69 not found: ID does not exist" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.226239 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017b2e87-9a6e-46c6-b061-1ed93bfd2322" path="/var/lib/kubelet/pods/017b2e87-9a6e-46c6-b061-1ed93bfd2322/volumes" Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.822216 4962 generic.go:334] "Generic (PLEG): container finished" podID="7b93d9d1-8679-4de4-84d6-adcfaae055d6" containerID="a8a745ab87145b519311f69420f318142bb7fe35df6aeb54c970f8d079766aab" exitCode=0 Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.822297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerDied","Data":"a8a745ab87145b519311f69420f318142bb7fe35df6aeb54c970f8d079766aab"} Dec 01 21:42:58 crc kubenswrapper[4962]: I1201 21:42:58.822444 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"d87e47bfdd7c6e7b217aee9b644b7d4933f1bc0c2cfba8b32f250d5ad1093b25"} Dec 01 21:42:59 crc kubenswrapper[4962]: I1201 21:42:59.831527 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"3355acc1bb40b625a2db9f0a987cb86f6d1220c1b7a540189581d4b3c980b222"} Dec 01 21:42:59 crc kubenswrapper[4962]: I1201 21:42:59.831989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"259a18d9c4b60b3b0c07b9e88a1fe0c229cde92306fddda859232f4f5b3cfd62"} Dec 01 21:42:59 crc kubenswrapper[4962]: I1201 21:42:59.831999 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"a08119ee64df9d54564c849cea182f37784bd07b2d45f93e3bc1dcacf2cecbbf"} Dec 01 21:42:59 crc kubenswrapper[4962]: I1201 21:42:59.832009 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"0943cd4b5ea77ffcd2947fe6c9c45b2bee3a878ecb1ee447a8aa5d973cb9561f"} Dec 01 21:42:59 crc kubenswrapper[4962]: I1201 21:42:59.832019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"0b9dae6d5dbab094bb81d4b26a0630f9c5fc22ffb361a14c140391cf67a5a5b8"} Dec 01 21:42:59 crc kubenswrapper[4962]: I1201 21:42:59.832027 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"e69f03ad3c3008315d09a37dec3bdda8ebc78b1d0cab8462e243df9f07639f7f"} Dec 01 21:43:02 crc kubenswrapper[4962]: I1201 21:43:02.784457 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:43:02 crc kubenswrapper[4962]: I1201 21:43:02.785279 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:43:02 crc kubenswrapper[4962]: I1201 21:43:02.863338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"272d786efcf9e355e7bf608931a2c81c00f19a73aee8ca31ff4dc6eec6234f3d"} Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.568154 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9"] Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.569045 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.570505 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.571216 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-dgdjc" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.571378 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.607468 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f"] Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.608622 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.612025 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-b75nt" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.612399 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.619674 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz"] Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.620381 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.695592 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vhrv\" (UniqueName: \"kubernetes.io/projected/c021e2cb-ee5d-4e74-a5fa-1ede1fde37df-kube-api-access-9vhrv\") pod \"obo-prometheus-operator-668cf9dfbb-btqq9\" (UID: \"c021e2cb-ee5d-4e74-a5fa-1ede1fde37df\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.699348 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mbs2x"] Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.700108 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.704373 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.704465 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-8dsmj" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.797800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cb92407-0085-483e-8079-3aa441bfd214-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mbs2x\" (UID: \"6cb92407-0085-483e-8079-3aa441bfd214\") " pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.797892 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf5940c1-cfd9-4ed4-93a0-db06782924ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz\" (UID: \"bf5940c1-cfd9-4ed4-93a0-db06782924ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.797962 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6a9273c-4395-4883-abbd-cfd15b5d552d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f\" (UID: \"b6a9273c-4395-4883-abbd-cfd15b5d552d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.798081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6a9273c-4395-4883-abbd-cfd15b5d552d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f\" (UID: \"b6a9273c-4395-4883-abbd-cfd15b5d552d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.798147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vhrv\" (UniqueName: \"kubernetes.io/projected/c021e2cb-ee5d-4e74-a5fa-1ede1fde37df-kube-api-access-9vhrv\") pod \"obo-prometheus-operator-668cf9dfbb-btqq9\" (UID: \"c021e2cb-ee5d-4e74-a5fa-1ede1fde37df\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.798206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf5940c1-cfd9-4ed4-93a0-db06782924ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz\" (UID: \"bf5940c1-cfd9-4ed4-93a0-db06782924ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.798229 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nsr2\" (UniqueName: \"kubernetes.io/projected/6cb92407-0085-483e-8079-3aa441bfd214-kube-api-access-9nsr2\") pod 
\"observability-operator-d8bb48f5d-mbs2x\" (UID: \"6cb92407-0085-483e-8079-3aa441bfd214\") " pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.800483 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-ff5lc"] Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.801233 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.804964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-45fs7" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.824732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vhrv\" (UniqueName: \"kubernetes.io/projected/c021e2cb-ee5d-4e74-a5fa-1ede1fde37df-kube-api-access-9vhrv\") pod \"obo-prometheus-operator-668cf9dfbb-btqq9\" (UID: \"c021e2cb-ee5d-4e74-a5fa-1ede1fde37df\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.884169 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf5940c1-cfd9-4ed4-93a0-db06782924ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz\" (UID: \"bf5940c1-cfd9-4ed4-93a0-db06782924ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nsr2\" (UniqueName: \"kubernetes.io/projected/6cb92407-0085-483e-8079-3aa441bfd214-kube-api-access-9nsr2\") pod \"observability-operator-d8bb48f5d-mbs2x\" (UID: \"6cb92407-0085-483e-8079-3aa441bfd214\") " pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900087 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/32158a1b-c7c3-4fda-98d1-69443d10d0a5-openshift-service-ca\") pod \"perses-operator-5446b9c989-ff5lc\" (UID: \"32158a1b-c7c3-4fda-98d1-69443d10d0a5\") " pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900132 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cb92407-0085-483e-8079-3aa441bfd214-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mbs2x\" (UID: \"6cb92407-0085-483e-8079-3aa441bfd214\") " pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf5940c1-cfd9-4ed4-93a0-db06782924ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz\" (UID: \"bf5940c1-cfd9-4ed4-93a0-db06782924ae\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6a9273c-4395-4883-abbd-cfd15b5d552d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f\" (UID: \"b6a9273c-4395-4883-abbd-cfd15b5d552d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900281 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8g9t\" (UniqueName: \"kubernetes.io/projected/32158a1b-c7c3-4fda-98d1-69443d10d0a5-kube-api-access-p8g9t\") pod \"perses-operator-5446b9c989-ff5lc\" (UID: \"32158a1b-c7c3-4fda-98d1-69443d10d0a5\") " pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.900325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6a9273c-4395-4883-abbd-cfd15b5d552d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f\" (UID: \"b6a9273c-4395-4883-abbd-cfd15b5d552d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.903664 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf5940c1-cfd9-4ed4-93a0-db06782924ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz\" (UID: \"bf5940c1-cfd9-4ed4-93a0-db06782924ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.903874 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cb92407-0085-483e-8079-3aa441bfd214-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mbs2x\" (UID: \"6cb92407-0085-483e-8079-3aa441bfd214\") " pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.904156 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6a9273c-4395-4883-abbd-cfd15b5d552d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f\" (UID: \"b6a9273c-4395-4883-abbd-cfd15b5d552d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.904852 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf5940c1-cfd9-4ed4-93a0-db06782924ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz\" (UID: \"bf5940c1-cfd9-4ed4-93a0-db06782924ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.908325 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(ea1ba491d04bbeb78d6ef2329b6d9c6f677006aa1fd2fb45d642d190505afb8f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.908399 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(ea1ba491d04bbeb78d6ef2329b6d9c6f677006aa1fd2fb45d642d190505afb8f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.908422 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(ea1ba491d04bbeb78d6ef2329b6d9c6f677006aa1fd2fb45d642d190505afb8f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.908464 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators(c021e2cb-ee5d-4e74-a5fa-1ede1fde37df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators(c021e2cb-ee5d-4e74-a5fa-1ede1fde37df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(ea1ba491d04bbeb78d6ef2329b6d9c6f677006aa1fd2fb45d642d190505afb8f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" podUID="c021e2cb-ee5d-4e74-a5fa-1ede1fde37df" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.909329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6a9273c-4395-4883-abbd-cfd15b5d552d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f\" (UID: \"b6a9273c-4395-4883-abbd-cfd15b5d552d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.922329 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.923559 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nsr2\" (UniqueName: \"kubernetes.io/projected/6cb92407-0085-483e-8079-3aa441bfd214-kube-api-access-9nsr2\") pod \"observability-operator-d8bb48f5d-mbs2x\" (UID: \"6cb92407-0085-483e-8079-3aa441bfd214\") " pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:03 crc kubenswrapper[4962]: I1201 21:43:03.937787 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.954891 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(775ea85286c264dbefcf540e6837811efb105d971a74d85aff90cf91d7cafef1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.954977 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(775ea85286c264dbefcf540e6837811efb105d971a74d85aff90cf91d7cafef1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.955009 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(775ea85286c264dbefcf540e6837811efb105d971a74d85aff90cf91d7cafef1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.955072 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators(b6a9273c-4395-4883-abbd-cfd15b5d552d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators(b6a9273c-4395-4883-abbd-cfd15b5d552d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(775ea85286c264dbefcf540e6837811efb105d971a74d85aff90cf91d7cafef1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" podUID="b6a9273c-4395-4883-abbd-cfd15b5d552d" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.969356 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(fb938cb57e51776d49d5990c30dc2000989702e2ebf077c2f0247da21b31c8a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.969450 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(fb938cb57e51776d49d5990c30dc2000989702e2ebf077c2f0247da21b31c8a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.969478 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(fb938cb57e51776d49d5990c30dc2000989702e2ebf077c2f0247da21b31c8a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:03 crc kubenswrapper[4962]: E1201 21:43:03.969551 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators(bf5940c1-cfd9-4ed4-93a0-db06782924ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators(bf5940c1-cfd9-4ed4-93a0-db06782924ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(fb938cb57e51776d49d5990c30dc2000989702e2ebf077c2f0247da21b31c8a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" podUID="bf5940c1-cfd9-4ed4-93a0-db06782924ae" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.001892 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8g9t\" (UniqueName: \"kubernetes.io/projected/32158a1b-c7c3-4fda-98d1-69443d10d0a5-kube-api-access-p8g9t\") pod \"perses-operator-5446b9c989-ff5lc\" (UID: \"32158a1b-c7c3-4fda-98d1-69443d10d0a5\") " pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.001998 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/32158a1b-c7c3-4fda-98d1-69443d10d0a5-openshift-service-ca\") pod \"perses-operator-5446b9c989-ff5lc\" (UID: \"32158a1b-c7c3-4fda-98d1-69443d10d0a5\") " pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.002868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/32158a1b-c7c3-4fda-98d1-69443d10d0a5-openshift-service-ca\") pod \"perses-operator-5446b9c989-ff5lc\" (UID: \"32158a1b-c7c3-4fda-98d1-69443d10d0a5\") " pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.020543 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8g9t\" (UniqueName: \"kubernetes.io/projected/32158a1b-c7c3-4fda-98d1-69443d10d0a5-kube-api-access-p8g9t\") pod \"perses-operator-5446b9c989-ff5lc\" (UID: \"32158a1b-c7c3-4fda-98d1-69443d10d0a5\") " pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.022918 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.047743 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(670b19e1a49e6c06af703d105bf31c32adf603bffb99df90fb5fafb511188172): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.047807 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(670b19e1a49e6c06af703d105bf31c32adf603bffb99df90fb5fafb511188172): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.047830 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(670b19e1a49e6c06af703d105bf31c32adf603bffb99df90fb5fafb511188172): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.047869 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-mbs2x_openshift-operators(6cb92407-0085-483e-8079-3aa441bfd214)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-mbs2x_openshift-operators(6cb92407-0085-483e-8079-3aa441bfd214)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(670b19e1a49e6c06af703d105bf31c32adf603bffb99df90fb5fafb511188172): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" podUID="6cb92407-0085-483e-8079-3aa441bfd214" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.118734 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.144790 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(19a379a52f378fe9bbf270b00e6524dce417e8a0169fb51a74a97652a3b6dc98): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.144867 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(19a379a52f378fe9bbf270b00e6524dce417e8a0169fb51a74a97652a3b6dc98): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.144896 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(19a379a52f378fe9bbf270b00e6524dce417e8a0169fb51a74a97652a3b6dc98): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:04 crc kubenswrapper[4962]: E1201 21:43:04.145003 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-ff5lc_openshift-operators(32158a1b-c7c3-4fda-98d1-69443d10d0a5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-ff5lc_openshift-operators(32158a1b-c7c3-4fda-98d1-69443d10d0a5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(19a379a52f378fe9bbf270b00e6524dce417e8a0169fb51a74a97652a3b6dc98): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" podUID="32158a1b-c7c3-4fda-98d1-69443d10d0a5" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.878322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" event={"ID":"7b93d9d1-8679-4de4-84d6-adcfaae055d6","Type":"ContainerStarted","Data":"1dffc8fab2a8aa7232ecdab64ea9eb4f6625949a8148d6b6c2c57e5526ba39f9"} Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.878675 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.878693 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.878705 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.920825 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" podStartSLOduration=7.920806195 podStartE2EDuration="7.920806195s" podCreationTimestamp="2025-12-01 21:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:43:04.91287855 +0000 UTC m=+569.014317755" watchObservedRunningTime="2025-12-01 21:43:04.920806195 +0000 UTC m=+569.022245390" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.928016 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:43:04 crc kubenswrapper[4962]: I1201 21:43:04.960689 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.351227 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz"] Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.351344 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.351998 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.359912 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f"] Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.360024 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.360434 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.377735 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-ff5lc"] Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.377828 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.378346 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.385500 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mbs2x"] Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.385596 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.386237 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.397145 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(e4eb95faa764342b79bb92207656eee090aaef357d957c0c5dce1fc4487867aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.397204 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(e4eb95faa764342b79bb92207656eee090aaef357d957c0c5dce1fc4487867aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.397223 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(e4eb95faa764342b79bb92207656eee090aaef357d957c0c5dce1fc4487867aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.397271 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators(b6a9273c-4395-4883-abbd-cfd15b5d552d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators(b6a9273c-4395-4883-abbd-cfd15b5d552d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(e4eb95faa764342b79bb92207656eee090aaef357d957c0c5dce1fc4487867aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" podUID="b6a9273c-4395-4883-abbd-cfd15b5d552d" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.401589 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(523bf3450f8233758d0c79556654de8ff90ac1e0326e6fcd9997bf9488bb2ae9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.401637 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(523bf3450f8233758d0c79556654de8ff90ac1e0326e6fcd9997bf9488bb2ae9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.401659 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(523bf3450f8233758d0c79556654de8ff90ac1e0326e6fcd9997bf9488bb2ae9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.401696 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators(bf5940c1-cfd9-4ed4-93a0-db06782924ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators(bf5940c1-cfd9-4ed4-93a0-db06782924ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(523bf3450f8233758d0c79556654de8ff90ac1e0326e6fcd9997bf9488bb2ae9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" podUID="bf5940c1-cfd9-4ed4-93a0-db06782924ae" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.425794 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9"] Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.425868 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:05 crc kubenswrapper[4962]: I1201 21:43:05.426180 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.432335 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(b956ef6e049add5008229402b90b1fe7bd4ec247151c28310653368fb35c9b6f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.432399 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(b956ef6e049add5008229402b90b1fe7bd4ec247151c28310653368fb35c9b6f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.432419 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(b956ef6e049add5008229402b90b1fe7bd4ec247151c28310653368fb35c9b6f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.432456 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-ff5lc_openshift-operators(32158a1b-c7c3-4fda-98d1-69443d10d0a5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-ff5lc_openshift-operators(32158a1b-c7c3-4fda-98d1-69443d10d0a5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(b956ef6e049add5008229402b90b1fe7bd4ec247151c28310653368fb35c9b6f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" podUID="32158a1b-c7c3-4fda-98d1-69443d10d0a5" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.452387 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(7b8eec2759c08fa829177d41f4f12c21ca5d0b9d321b8667e3600cfdfd215fb3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.452466 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(7b8eec2759c08fa829177d41f4f12c21ca5d0b9d321b8667e3600cfdfd215fb3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.452495 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(7b8eec2759c08fa829177d41f4f12c21ca5d0b9d321b8667e3600cfdfd215fb3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.452551 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-mbs2x_openshift-operators(6cb92407-0085-483e-8079-3aa441bfd214)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-mbs2x_openshift-operators(6cb92407-0085-483e-8079-3aa441bfd214)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(7b8eec2759c08fa829177d41f4f12c21ca5d0b9d321b8667e3600cfdfd215fb3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" podUID="6cb92407-0085-483e-8079-3aa441bfd214" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.463414 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(182e1dc65751cb28b887d966285e36e9cedec839154df6aab9dbe3ff9f0ef17a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.463455 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(182e1dc65751cb28b887d966285e36e9cedec839154df6aab9dbe3ff9f0ef17a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.463474 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(182e1dc65751cb28b887d966285e36e9cedec839154df6aab9dbe3ff9f0ef17a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:05 crc kubenswrapper[4962]: E1201 21:43:05.463509 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators(c021e2cb-ee5d-4e74-a5fa-1ede1fde37df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators(c021e2cb-ee5d-4e74-a5fa-1ede1fde37df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(182e1dc65751cb28b887d966285e36e9cedec839154df6aab9dbe3ff9f0ef17a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" podUID="c021e2cb-ee5d-4e74-a5fa-1ede1fde37df" Dec 01 21:43:10 crc kubenswrapper[4962]: I1201 21:43:10.220085 4962 scope.go:117] "RemoveContainer" containerID="1b8d562c177ec53feb71127f293276762385b527ac37171b4992a030c29c6db7" Dec 01 21:43:10 crc kubenswrapper[4962]: E1201 21:43:10.221257 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-m4wg5_openshift-multus(f38b9e31-13b0-4a48-93bf-b3722ca60642)\"" pod="openshift-multus/multus-m4wg5" podUID="f38b9e31-13b0-4a48-93bf-b3722ca60642" Dec 01 21:43:18 crc kubenswrapper[4962]: I1201 21:43:18.218970 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:18 crc kubenswrapper[4962]: I1201 21:43:18.219008 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:18 crc kubenswrapper[4962]: I1201 21:43:18.219918 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:18 crc kubenswrapper[4962]: I1201 21:43:18.219921 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.250681 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(214324ec312e7378359078a4d028aba2593ebab63bc8ec1e338503c1e91f89df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.250745 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(214324ec312e7378359078a4d028aba2593ebab63bc8ec1e338503c1e91f89df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.250765 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(214324ec312e7378359078a4d028aba2593ebab63bc8ec1e338503c1e91f89df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.250811 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators(c021e2cb-ee5d-4e74-a5fa-1ede1fde37df)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators(c021e2cb-ee5d-4e74-a5fa-1ede1fde37df)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df_0(214324ec312e7378359078a4d028aba2593ebab63bc8ec1e338503c1e91f89df): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" podUID="c021e2cb-ee5d-4e74-a5fa-1ede1fde37df" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.265714 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(a506e237e81f6f8c5685db0537ab7678cc30f7804b9e5c83c79b3a940dc4e3b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.265785 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(a506e237e81f6f8c5685db0537ab7678cc30f7804b9e5c83c79b3a940dc4e3b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.265811 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(a506e237e81f6f8c5685db0537ab7678cc30f7804b9e5c83c79b3a940dc4e3b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" Dec 01 21:43:18 crc kubenswrapper[4962]: E1201 21:43:18.265874 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators(b6a9273c-4395-4883-abbd-cfd15b5d552d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators(b6a9273c-4395-4883-abbd-cfd15b5d552d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_openshift-operators_b6a9273c-4395-4883-abbd-cfd15b5d552d_0(a506e237e81f6f8c5685db0537ab7678cc30f7804b9e5c83c79b3a940dc4e3b2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" podUID="b6a9273c-4395-4883-abbd-cfd15b5d552d" Dec 01 21:43:19 crc kubenswrapper[4962]: I1201 21:43:19.219607 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:19 crc kubenswrapper[4962]: I1201 21:43:19.220408 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:19 crc kubenswrapper[4962]: E1201 21:43:19.249500 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(2c9121d74f38d2d5a18f0537cff70e58390c4534d00166dc9a879fa79a13a3eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:19 crc kubenswrapper[4962]: E1201 21:43:19.249611 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(2c9121d74f38d2d5a18f0537cff70e58390c4534d00166dc9a879fa79a13a3eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:19 crc kubenswrapper[4962]: E1201 21:43:19.249632 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(2c9121d74f38d2d5a18f0537cff70e58390c4534d00166dc9a879fa79a13a3eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:43:19 crc kubenswrapper[4962]: E1201 21:43:19.249698 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-ff5lc_openshift-operators(32158a1b-c7c3-4fda-98d1-69443d10d0a5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-ff5lc_openshift-operators(32158a1b-c7c3-4fda-98d1-69443d10d0a5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-ff5lc_openshift-operators_32158a1b-c7c3-4fda-98d1-69443d10d0a5_0(2c9121d74f38d2d5a18f0537cff70e58390c4534d00166dc9a879fa79a13a3eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" podUID="32158a1b-c7c3-4fda-98d1-69443d10d0a5" Dec 01 21:43:20 crc kubenswrapper[4962]: I1201 21:43:20.218668 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:20 crc kubenswrapper[4962]: I1201 21:43:20.218668 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:20 crc kubenswrapper[4962]: I1201 21:43:20.219180 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:20 crc kubenswrapper[4962]: I1201 21:43:20.219185 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.286971 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(eb980655d2bdb17f775b3fbc475ae422f8786f81a6966e5f1202351cc8875bdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.287079 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(eb980655d2bdb17f775b3fbc475ae422f8786f81a6966e5f1202351cc8875bdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.287124 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(eb980655d2bdb17f775b3fbc475ae422f8786f81a6966e5f1202351cc8875bdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.287201 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-mbs2x_openshift-operators(6cb92407-0085-483e-8079-3aa441bfd214)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-mbs2x_openshift-operators(6cb92407-0085-483e-8079-3aa441bfd214)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-mbs2x_openshift-operators_6cb92407-0085-483e-8079-3aa441bfd214_0(eb980655d2bdb17f775b3fbc475ae422f8786f81a6966e5f1202351cc8875bdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" podUID="6cb92407-0085-483e-8079-3aa441bfd214" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.307593 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(c7732ad9eb722c586cc7c5e7c81fc0f4fdd8934025345493ee285f8b4578852f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.307686 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(c7732ad9eb722c586cc7c5e7c81fc0f4fdd8934025345493ee285f8b4578852f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.307718 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(c7732ad9eb722c586cc7c5e7c81fc0f4fdd8934025345493ee285f8b4578852f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" Dec 01 21:43:20 crc kubenswrapper[4962]: E1201 21:43:20.307794 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators(bf5940c1-cfd9-4ed4-93a0-db06782924ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators(bf5940c1-cfd9-4ed4-93a0-db06782924ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_openshift-operators_bf5940c1-cfd9-4ed4-93a0-db06782924ae_0(c7732ad9eb722c586cc7c5e7c81fc0f4fdd8934025345493ee285f8b4578852f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 21:43:25 crc kubenswrapper[4962]: I1201 21:43:25.219649 4962 scope.go:117] "RemoveContainer" containerID="1b8d562c177ec53feb71127f293276762385b527ac37171b4992a030c29c6db7"
Dec 01 21:43:26 crc kubenswrapper[4962]: I1201 21:43:26.003513 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m4wg5_f38b9e31-13b0-4a48-93bf-b3722ca60642/kube-multus/2.log"
Dec 01 21:43:26 crc kubenswrapper[4962]: I1201 21:43:26.003807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m4wg5" event={"ID":"f38b9e31-13b0-4a48-93bf-b3722ca60642","Type":"ContainerStarted","Data":"8a3b627aca7bd12224fb892b5e9df3b54416b11db4903cb4b2210c4f10772c8c"}
Dec 01 21:43:27 crc kubenswrapper[4962]: I1201 21:43:27.837003 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdx49"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.219072 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.219293 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.219417 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.219748 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.220134 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.220745 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.221806 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.221829 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-ff5lc"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.684335 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f"]
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.771368 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9"]
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.774811 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mbs2x"]
Dec 01 21:43:32 crc kubenswrapper[4962]: W1201 21:43:32.783091 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc021e2cb_ee5d_4e74_a5fa_1ede1fde37df.slice/crio-7e8b4e033c34875d88c94db457e76dcf265abe47368d3950961aee34c2cd03ae WatchSource:0}: Error finding container 7e8b4e033c34875d88c94db457e76dcf265abe47368d3950961aee34c2cd03ae: Status 404 returned error can't find the container with id 7e8b4e033c34875d88c94db457e76dcf265abe47368d3950961aee34c2cd03ae
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.786360 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 21:43:32 crc kubenswrapper[4962]: W1201 21:43:32.786389 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cb92407_0085_483e_8079_3aa441bfd214.slice/crio-8b3e55d9925e76b50fa927012ca60bf7535ccfc564d738260d459e02ab6d9891 WatchSource:0}: Error finding container 8b3e55d9925e76b50fa927012ca60bf7535ccfc564d738260d459e02ab6d9891: Status 404 returned error can't find the container with id 8b3e55d9925e76b50fa927012ca60bf7535ccfc564d738260d459e02ab6d9891
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.786396 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 21:43:32 crc kubenswrapper[4962]: I1201 21:43:32.834385 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-ff5lc"]
Dec 01 21:43:32 crc kubenswrapper[4962]: W1201 21:43:32.841622 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32158a1b_c7c3_4fda_98d1_69443d10d0a5.slice/crio-49c4c9f8199661aba625ec13eaf589e82aff4946f3f03691d46b044511af01da WatchSource:0}: Error finding container 49c4c9f8199661aba625ec13eaf589e82aff4946f3f03691d46b044511af01da: Status 404 returned error can't find the container with id 49c4c9f8199661aba625ec13eaf589e82aff4946f3f03691d46b044511af01da
Dec 01 21:43:33 crc kubenswrapper[4962]: I1201 21:43:33.058537 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" event={"ID":"32158a1b-c7c3-4fda-98d1-69443d10d0a5","Type":"ContainerStarted","Data":"49c4c9f8199661aba625ec13eaf589e82aff4946f3f03691d46b044511af01da"}
Dec 01 21:43:33 crc kubenswrapper[4962]: I1201 21:43:33.061643 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" event={"ID":"6cb92407-0085-483e-8079-3aa441bfd214","Type":"ContainerStarted","Data":"8b3e55d9925e76b50fa927012ca60bf7535ccfc564d738260d459e02ab6d9891"}
Dec 01 21:43:33 crc kubenswrapper[4962]: I1201 21:43:33.064852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" event={"ID":"b6a9273c-4395-4883-abbd-cfd15b5d552d","Type":"ContainerStarted","Data":"d4bcd9cb011d811e10e01b2c13d13eb0dd115cd4f8a9e8ff4ca6b600b729ddd3"}
Dec 01 21:43:33 crc kubenswrapper[4962]: I1201 21:43:33.068250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" event={"ID":"c021e2cb-ee5d-4e74-a5fa-1ede1fde37df","Type":"ContainerStarted","Data":"7e8b4e033c34875d88c94db457e76dcf265abe47368d3950961aee34c2cd03ae"}
Dec 01 21:43:34 crc kubenswrapper[4962]: I1201 21:43:34.222190 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz"
Dec 01 21:43:34 crc kubenswrapper[4962]: I1201 21:43:34.222651 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz"
Dec 01 21:43:34 crc kubenswrapper[4962]: I1201 21:43:34.657810 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz"]
Dec 01 21:43:35 crc kubenswrapper[4962]: I1201 21:43:35.091427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" event={"ID":"bf5940c1-cfd9-4ed4-93a0-db06782924ae","Type":"ContainerStarted","Data":"6f7586b17b74f9932db1bbaee121c7e04fd010706fd4fe5d9f20692db209c140"}
Dec 01 21:43:47 crc kubenswrapper[4962]: E1201 21:43:47.865998 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3"
Dec 01 21:43:47 crc kubenswrapper[4962]: E1201 21:43:47.866899 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9vhrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-btqq9_openshift-operators(c021e2cb-ee5d-4e74-a5fa-1ede1fde37df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 21:43:47 crc kubenswrapper[4962]: E1201 21:43:47.868117 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" podUID="c021e2cb-ee5d-4e74-a5fa-1ede1fde37df"
Dec 01 21:43:48 crc kubenswrapper[4962]: E1201 21:43:48.193348 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" podUID="c021e2cb-ee5d-4e74-a5fa-1ede1fde37df"
Dec 01 21:43:48 crc kubenswrapper[4962]: E1201 21:43:48.257790 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385"
Dec 01 21:43:48 crc kubenswrapper[4962]: E1201 21:43:48.258005 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8g9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-ff5lc_openshift-operators(32158a1b-c7c3-4fda-98d1-69443d10d0a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 21:43:48 crc kubenswrapper[4962]: E1201 21:43:48.259850 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" podUID="32158a1b-c7c3-4fda-98d1-69443d10d0a5"
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.198141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" event={"ID":"b6a9273c-4395-4883-abbd-cfd15b5d552d","Type":"ContainerStarted","Data":"aee52e3637780fac0f8078ce513352bd55ac0f12b1c04a8a9f2dc3440ef726b0"}
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.200498 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" event={"ID":"bf5940c1-cfd9-4ed4-93a0-db06782924ae","Type":"ContainerStarted","Data":"f46c6bc0f11b5481be93967690895238728a5a9a92da69c4478f5859b3c14923"}
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.204714 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" event={"ID":"6cb92407-0085-483e-8079-3aa441bfd214","Type":"ContainerStarted","Data":"bf190e86f127506ac8b7ed2c160ecd5f7409607e2a402133c552dce8e398e7d4"}
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.205081 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x"
Dec 01 21:43:49 crc kubenswrapper[4962]: E1201 21:43:49.207513 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" podUID="32158a1b-c7c3-4fda-98d1-69443d10d0a5"
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.223267 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f" podStartSLOduration=30.646980321 podStartE2EDuration="46.223249426s" podCreationTimestamp="2025-12-01 21:43:03 +0000 UTC" firstStartedPulling="2025-12-01 21:43:32.692927338 +0000 UTC m=+596.794366533" lastFinishedPulling="2025-12-01 21:43:48.269196443 +0000 UTC m=+612.370635638" observedRunningTime="2025-12-01 21:43:49.221093094 +0000 UTC m=+613.322532319" watchObservedRunningTime="2025-12-01 21:43:49.223249426 +0000 UTC m=+613.324688631"
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.279048 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x" podStartSLOduration=30.792971514 podStartE2EDuration="46.279021646s" podCreationTimestamp="2025-12-01 21:43:03 +0000 UTC" firstStartedPulling="2025-12-01 21:43:32.788548955 +0000 UTC m=+596.889988150" lastFinishedPulling="2025-12-01 21:43:48.274599077 +0000 UTC m=+612.376038282" observedRunningTime="2025-12-01 21:43:49.271293726 +0000 UTC m=+613.372732931" watchObservedRunningTime="2025-12-01 21:43:49.279021646 +0000 UTC m=+613.380460851"
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.287031 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-mbs2x"
Dec 01 21:43:49 crc kubenswrapper[4962]: I1201 21:43:49.307068 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz" podStartSLOduration=32.711059922 podStartE2EDuration="46.307039755s" podCreationTimestamp="2025-12-01 21:43:03 +0000 UTC" firstStartedPulling="2025-12-01 21:43:34.654385513 +0000 UTC m=+598.755824758" lastFinishedPulling="2025-12-01 21:43:48.250365396 +0000 UTC m=+612.351804591" observedRunningTime="2025-12-01 21:43:49.30300404 +0000 UTC m=+613.404443235" watchObservedRunningTime="2025-12-01 21:43:49.307039755 +0000 UTC m=+613.408478990"
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.794316 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj"
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.879107 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/96d314cb-1713-4e20-8eec-70bedd5cabad-audit-log\") pod \"96d314cb-1713-4e20-8eec-70bedd5cabad\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") "
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.879408 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-client-certs\") pod \"96d314cb-1713-4e20-8eec-70bedd5cabad\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") "
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.879464 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-configmap-kubelet-serving-ca-bundle\") pod \"96d314cb-1713-4e20-8eec-70bedd5cabad\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") "
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.879591 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v488\" (UniqueName: \"kubernetes.io/projected/96d314cb-1713-4e20-8eec-70bedd5cabad-kube-api-access-5v488\") pod \"96d314cb-1713-4e20-8eec-70bedd5cabad\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") "
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.879760 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d314cb-1713-4e20-8eec-70bedd5cabad-audit-log" (OuterVolumeSpecName: "audit-log") pod "96d314cb-1713-4e20-8eec-70bedd5cabad" (UID: "96d314cb-1713-4e20-8eec-70bedd5cabad"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.880303 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-server-tls\") pod \"96d314cb-1713-4e20-8eec-70bedd5cabad\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") "
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.880359 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-metrics-server-audit-profiles\") pod \"96d314cb-1713-4e20-8eec-70bedd5cabad\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") "
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.880088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "96d314cb-1713-4e20-8eec-70bedd5cabad" (UID: "96d314cb-1713-4e20-8eec-70bedd5cabad"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.880405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-client-ca-bundle\") pod \"96d314cb-1713-4e20-8eec-70bedd5cabad\" (UID: \"96d314cb-1713-4e20-8eec-70bedd5cabad\") "
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.880850 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "96d314cb-1713-4e20-8eec-70bedd5cabad" (UID: "96d314cb-1713-4e20-8eec-70bedd5cabad"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.880966 4962 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/96d314cb-1713-4e20-8eec-70bedd5cabad-audit-log\") on node \"crc\" DevicePath \"\""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.880985 4962 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-configmap-kubelet-serving-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.881000 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/96d314cb-1713-4e20-8eec-70bedd5cabad-metrics-server-audit-profiles\") on node \"crc\" DevicePath \"\""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.884847 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "96d314cb-1713-4e20-8eec-70bedd5cabad" (UID: "96d314cb-1713-4e20-8eec-70bedd5cabad"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.885496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d314cb-1713-4e20-8eec-70bedd5cabad-kube-api-access-5v488" (OuterVolumeSpecName: "kube-api-access-5v488") pod "96d314cb-1713-4e20-8eec-70bedd5cabad" (UID: "96d314cb-1713-4e20-8eec-70bedd5cabad"). InnerVolumeSpecName "kube-api-access-5v488". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.885764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "96d314cb-1713-4e20-8eec-70bedd5cabad" (UID: "96d314cb-1713-4e20-8eec-70bedd5cabad"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.887190 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "96d314cb-1713-4e20-8eec-70bedd5cabad" (UID: "96d314cb-1713-4e20-8eec-70bedd5cabad"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.982397 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v488\" (UniqueName: \"kubernetes.io/projected/96d314cb-1713-4e20-8eec-70bedd5cabad-kube-api-access-5v488\") on node \"crc\" DevicePath \"\""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.982458 4962 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-server-tls\") on node \"crc\" DevicePath \"\""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.982480 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-client-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 21:43:50 crc kubenswrapper[4962]: I1201 21:43:50.982499 4962 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96d314cb-1713-4e20-8eec-70bedd5cabad-secret-metrics-client-certs\") on node \"crc\" DevicePath \"\""
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.221687 4962 generic.go:334] "Generic (PLEG): container finished" podID="96d314cb-1713-4e20-8eec-70bedd5cabad" containerID="305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa" exitCode=0
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.221762 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" event={"ID":"96d314cb-1713-4e20-8eec-70bedd5cabad","Type":"ContainerDied","Data":"305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa"}
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.221787 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj"
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.221823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-98676cfd9-qw7bj" event={"ID":"96d314cb-1713-4e20-8eec-70bedd5cabad","Type":"ContainerDied","Data":"351113dbd95ada2891e84bcc4be8059f46aa569bd39542c0cfe5103142183db0"}
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.221852 4962 scope.go:117] "RemoveContainer" containerID="305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa"
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.249345 4962 scope.go:117] "RemoveContainer" containerID="305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa"
Dec 01 21:43:51 crc kubenswrapper[4962]: E1201 21:43:51.249840 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa\": container with ID starting with 305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa not found: ID does not exist" containerID="305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa"
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.249893 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa"} err="failed to get container status \"305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa\": rpc error: code = NotFound desc = could not find container \"305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa\": container with ID starting with 305fc01654247128bde1a7cd7511f2b42c4df8a1704e7135b1469d1649722aaa not found: ID does not exist"
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.268325 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-98676cfd9-qw7bj"]
Dec 01 21:43:51 crc kubenswrapper[4962]: I1201 21:43:51.273296 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-98676cfd9-qw7bj"]
Dec 01 21:43:52 crc kubenswrapper[4962]: I1201 21:43:52.229383 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96d314cb-1713-4e20-8eec-70bedd5cabad" path="/var/lib/kubelet/pods/96d314cb-1713-4e20-8eec-70bedd5cabad/volumes"
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.473445 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.473547 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vxgr4" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.477912 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lzx5s"] Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.478861 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lzx5s" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.479568 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.482140 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fh7kv" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.490365 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdbt\" (UniqueName: \"kubernetes.io/projected/1ed989fa-af32-4ec2-9ead-2681d1b96741-kube-api-access-jwdbt\") pod \"cert-manager-cainjector-7f985d654d-mfzw8\" (UID: \"1ed989fa-af32-4ec2-9ead-2681d1b96741\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.491743 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mfzw8"] Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.504518 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lzx5s"] Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.516572 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zwlk2"] Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.517657 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.520372 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mrmv5" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.523115 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zwlk2"] Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.591501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz9mc\" (UniqueName: \"kubernetes.io/projected/09bfd310-14dd-4f11-90d4-2b67683a468a-kube-api-access-jz9mc\") pod \"cert-manager-5b446d88c5-lzx5s\" (UID: \"09bfd310-14dd-4f11-90d4-2b67683a468a\") " pod="cert-manager/cert-manager-5b446d88c5-lzx5s" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.591566 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdbt\" (UniqueName: \"kubernetes.io/projected/1ed989fa-af32-4ec2-9ead-2681d1b96741-kube-api-access-jwdbt\") pod \"cert-manager-cainjector-7f985d654d-mfzw8\" (UID: \"1ed989fa-af32-4ec2-9ead-2681d1b96741\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.591621 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg2ds\" (UniqueName: \"kubernetes.io/projected/bce7cb19-aeee-4ad9-9284-46e78c5e1d6f-kube-api-access-vg2ds\") pod \"cert-manager-webhook-5655c58dd6-zwlk2\" (UID: \"bce7cb19-aeee-4ad9-9284-46e78c5e1d6f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.609432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwdbt\" (UniqueName: \"kubernetes.io/projected/1ed989fa-af32-4ec2-9ead-2681d1b96741-kube-api-access-jwdbt\") pod \"cert-manager-cainjector-7f985d654d-mfzw8\" (UID: \"1ed989fa-af32-4ec2-9ead-2681d1b96741\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.692380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz9mc\" (UniqueName: \"kubernetes.io/projected/09bfd310-14dd-4f11-90d4-2b67683a468a-kube-api-access-jz9mc\") pod \"cert-manager-5b446d88c5-lzx5s\" (UID: \"09bfd310-14dd-4f11-90d4-2b67683a468a\") " pod="cert-manager/cert-manager-5b446d88c5-lzx5s" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.692724 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg2ds\" (UniqueName: \"kubernetes.io/projected/bce7cb19-aeee-4ad9-9284-46e78c5e1d6f-kube-api-access-vg2ds\") pod \"cert-manager-webhook-5655c58dd6-zwlk2\" (UID: \"bce7cb19-aeee-4ad9-9284-46e78c5e1d6f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.708356 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz9mc\" (UniqueName: \"kubernetes.io/projected/09bfd310-14dd-4f11-90d4-2b67683a468a-kube-api-access-jz9mc\") pod \"cert-manager-5b446d88c5-lzx5s\" (UID: \"09bfd310-14dd-4f11-90d4-2b67683a468a\") " pod="cert-manager/cert-manager-5b446d88c5-lzx5s" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.718544 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vg2ds\" (UniqueName: \"kubernetes.io/projected/bce7cb19-aeee-4ad9-9284-46e78c5e1d6f-kube-api-access-vg2ds\") pod \"cert-manager-webhook-5655c58dd6-zwlk2\" (UID: \"bce7cb19-aeee-4ad9-9284-46e78c5e1d6f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.790805 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.800734 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lzx5s" Dec 01 21:43:58 crc kubenswrapper[4962]: I1201 21:43:58.830207 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" Dec 01 21:43:59 crc kubenswrapper[4962]: I1201 21:43:59.216319 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mfzw8"] Dec 01 21:43:59 crc kubenswrapper[4962]: I1201 21:43:59.274062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" event={"ID":"1ed989fa-af32-4ec2-9ead-2681d1b96741","Type":"ContainerStarted","Data":"a1768540eb5471b2e5eb25356a679230589620d6347f8a75e6c9054f4845dbd8"} Dec 01 21:43:59 crc kubenswrapper[4962]: I1201 21:43:59.296767 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lzx5s"] Dec 01 21:43:59 crc kubenswrapper[4962]: W1201 21:43:59.305869 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09bfd310_14dd_4f11_90d4_2b67683a468a.slice/crio-aa523a8810bdddf38c6ca89a5fa7614e4d35f9cd93844fc3e1477f71b161aa2b WatchSource:0}: Error finding container aa523a8810bdddf38c6ca89a5fa7614e4d35f9cd93844fc3e1477f71b161aa2b: Status 404 returned error can't find the container with id aa523a8810bdddf38c6ca89a5fa7614e4d35f9cd93844fc3e1477f71b161aa2b Dec 01 21:43:59 crc kubenswrapper[4962]: I1201 21:43:59.332239 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-zwlk2"] Dec 01 21:44:00 crc kubenswrapper[4962]: I1201 21:44:00.295057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lzx5s" event={"ID":"09bfd310-14dd-4f11-90d4-2b67683a468a","Type":"ContainerStarted","Data":"aa523a8810bdddf38c6ca89a5fa7614e4d35f9cd93844fc3e1477f71b161aa2b"} Dec 01 21:44:00 crc kubenswrapper[4962]: I1201 21:44:00.295971 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" event={"ID":"bce7cb19-aeee-4ad9-9284-46e78c5e1d6f","Type":"ContainerStarted","Data":"d10c62e25026677fba1ad5c6133676d60cffc23fdf8f5d195736e7fd3f478e90"} Dec 01 21:44:02 crc kubenswrapper[4962]: I1201 21:44:02.784341 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:44:02 crc kubenswrapper[4962]: I1201 21:44:02.785964 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:44:02 crc kubenswrapper[4962]: I1201 21:44:02.786150 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:44:02 crc kubenswrapper[4962]: I1201 21:44:02.786995 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b21fa950e24527d3d7f2945a34117bc2a69fe50d90966acf9350574b99da5ad"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 21:44:02 crc kubenswrapper[4962]: I1201 21:44:02.787170 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://4b21fa950e24527d3d7f2945a34117bc2a69fe50d90966acf9350574b99da5ad" gracePeriod=600 Dec 01 21:44:03 crc kubenswrapper[4962]: I1201 21:44:03.315564 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="4b21fa950e24527d3d7f2945a34117bc2a69fe50d90966acf9350574b99da5ad" exitCode=0 Dec 01 21:44:03 crc kubenswrapper[4962]: I1201 21:44:03.315865 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"4b21fa950e24527d3d7f2945a34117bc2a69fe50d90966acf9350574b99da5ad"} Dec 01 21:44:03 crc kubenswrapper[4962]: I1201 21:44:03.315895 4962 scope.go:117] "RemoveContainer" containerID="749bd494341ecd94507a174dd68318952a7c94f26fd3fad275718b333cbd13e5" Dec 01 21:44:04 crc kubenswrapper[4962]: I1201 21:44:04.323324 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"d2879c1f1c1a43cf7797f56147cd78f2bf5ee957daff607dcee5e6d23c293a8c"} Dec 01 21:44:04 crc kubenswrapper[4962]: I1201 21:44:04.324952 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" event={"ID":"1ed989fa-af32-4ec2-9ead-2681d1b96741","Type":"ContainerStarted","Data":"d55f3ca8b301fc985071fe0160d176013221272395cc27c4cfe98b6911492150"} Dec 01 21:44:04 crc kubenswrapper[4962]: I1201 21:44:04.326702 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lzx5s" event={"ID":"09bfd310-14dd-4f11-90d4-2b67683a468a","Type":"ContainerStarted","Data":"0afc4bbbd9b1a51f0e86785fd71d463520a32ec15c4138ab30e7debc7af97a21"} Dec 01 21:44:04 crc kubenswrapper[4962]: I1201 21:44:04.328017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" event={"ID":"bce7cb19-aeee-4ad9-9284-46e78c5e1d6f","Type":"ContainerStarted","Data":"674001fae34c901a1db044462f3688329889ea351d97a6a220f4b1ea5e0314bf"} Dec 01 21:44:04 crc kubenswrapper[4962]: I1201 21:44:04.328184 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" Dec 01 21:44:04 crc kubenswrapper[4962]: I1201 21:44:04.361645 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-mfzw8" podStartSLOduration=1.972780183 podStartE2EDuration="6.361627145s" podCreationTimestamp="2025-12-01 21:43:58 +0000 UTC" firstStartedPulling="2025-12-01 21:43:59.225768157 +0000 UTC m=+623.327207352" lastFinishedPulling="2025-12-01 21:44:03.614615079 +0000 UTC m=+627.716054314" observedRunningTime="2025-12-01 21:44:04.360073011 +0000 UTC m=+628.461512206" watchObservedRunningTime="2025-12-01 21:44:04.361627145 +0000 UTC m=+628.463066340" Dec 01 21:44:04 crc kubenswrapper[4962]: I1201 21:44:04.375412 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" podStartSLOduration=2.032319281 podStartE2EDuration="6.375395328s" podCreationTimestamp="2025-12-01 21:43:58 +0000 UTC" firstStartedPulling="2025-12-01 21:43:59.33913771 +0000 UTC m=+623.440576905" lastFinishedPulling="2025-12-01 21:44:03.682213707 +0000 UTC m=+627.783652952" observedRunningTime="2025-12-01 21:44:04.373373201 +0000 UTC m=+628.474812396" watchObservedRunningTime="2025-12-01 21:44:04.375395328 +0000 UTC m=+628.476834523" Dec 01 21:44:05 crc kubenswrapper[4962]: I1201 21:44:05.335006 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" event={"ID":"c021e2cb-ee5d-4e74-a5fa-1ede1fde37df","Type":"ContainerStarted","Data":"8788bedf042cd269a4df7068711f67404452fc8fbdbdbb0c304a96e46f502e73"} Dec 01 21:44:05 crc kubenswrapper[4962]: I1201 21:44:05.336812 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" event={"ID":"32158a1b-c7c3-4fda-98d1-69443d10d0a5","Type":"ContainerStarted","Data":"c1040dca68849ae977eaea925b1ecddd963829b24b0c2bbf685a20ccbc68e3ef"} Dec 01 21:44:05 crc kubenswrapper[4962]: I1201 21:44:05.360828 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-btqq9" podStartSLOduration=30.864936267 podStartE2EDuration="1m2.360811965s" podCreationTimestamp="2025-12-01 21:43:03 +0000 UTC" firstStartedPulling="2025-12-01 21:43:32.787409003 +0000 UTC m=+596.888848198" lastFinishedPulling="2025-12-01 21:44:04.283284701 +0000 UTC m=+628.384723896" observedRunningTime="2025-12-01 21:44:05.356663957 +0000 UTC m=+629.458103162" watchObservedRunningTime="2025-12-01 21:44:05.360811965 +0000 UTC m=+629.462251170" Dec 01 21:44:05 crc kubenswrapper[4962]: I1201 21:44:05.362391 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-lzx5s" podStartSLOduration=3.060839548 podStartE2EDuration="7.36238255s" podCreationTimestamp="2025-12-01 21:43:58 +0000 UTC" firstStartedPulling="2025-12-01 21:43:59.313083097 +0000 UTC m=+623.414522292" lastFinishedPulling="2025-12-01 21:44:03.614626099 +0000 UTC m=+627.716065294" observedRunningTime="2025-12-01 21:44:04.3965107 +0000 UTC m=+628.497949905" watchObservedRunningTime="2025-12-01 21:44:05.36238255 +0000 UTC m=+629.463821775" Dec 01 21:44:05 crc kubenswrapper[4962]: I1201 21:44:05.383497 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" podStartSLOduration=30.944296421 podStartE2EDuration="1m2.383480302s" podCreationTimestamp="2025-12-01 21:43:03 +0000 UTC" firstStartedPulling="2025-12-01 21:43:32.84550062 +0000 UTC m=+596.946939815" lastFinishedPulling="2025-12-01 21:44:04.284684501 +0000 UTC m=+628.386123696" 
observedRunningTime="2025-12-01 21:44:05.378047157 +0000 UTC m=+629.479486422" watchObservedRunningTime="2025-12-01 21:44:05.383480302 +0000 UTC m=+629.484919497" Dec 01 21:44:08 crc kubenswrapper[4962]: I1201 21:44:08.833192 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-zwlk2" Dec 01 21:44:14 crc kubenswrapper[4962]: I1201 21:44:14.119604 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:44:14 crc kubenswrapper[4962]: I1201 21:44:14.122122 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-ff5lc" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.558461 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9"] Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.560584 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.564635 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.580470 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9"] Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.733376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.733435 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjzrl\" (UniqueName: \"kubernetes.io/projected/2cb7ef80-3761-4574-9c1f-52405a401ebc-kube-api-access-hjzrl\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.733467 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.743158 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp"] Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.744578 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.757034 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp"] Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.835730 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.835874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.835919 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.835987 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.836015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mshdc\" (UniqueName: \"kubernetes.io/projected/1a573027-bd61-4a79-b4da-d0b25cb44908-kube-api-access-mshdc\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.836048 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjzrl\" (UniqueName: \"kubernetes.io/projected/2cb7ef80-3761-4574-9c1f-52405a401ebc-kube-api-access-hjzrl\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.836520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " 
pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.838603 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.859713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjzrl\" (UniqueName: \"kubernetes.io/projected/2cb7ef80-3761-4574-9c1f-52405a401ebc-kube-api-access-hjzrl\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.880803 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.937651 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.937709 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.937762 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mshdc\" (UniqueName: \"kubernetes.io/projected/1a573027-bd61-4a79-b4da-d0b25cb44908-kube-api-access-mshdc\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.938819 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc kubenswrapper[4962]: I1201 21:44:32.939189 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:32 crc 
kubenswrapper[4962]: I1201 21:44:32.961207 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mshdc\" (UniqueName: \"kubernetes.io/projected/1a573027-bd61-4a79-b4da-d0b25cb44908-kube-api-access-mshdc\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:33 crc kubenswrapper[4962]: I1201 21:44:33.103284 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:33 crc kubenswrapper[4962]: I1201 21:44:33.290554 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9"] Dec 01 21:44:33 crc kubenswrapper[4962]: I1201 21:44:33.322130 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp"] Dec 01 21:44:33 crc kubenswrapper[4962]: I1201 21:44:33.550661 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" event={"ID":"2cb7ef80-3761-4574-9c1f-52405a401ebc","Type":"ContainerStarted","Data":"02e2932038a4e45e6c5108b7c5af4d8d70e4a6e810d7de93e6f2e2660926b9a3"} Dec 01 21:44:33 crc kubenswrapper[4962]: I1201 21:44:33.550908 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" event={"ID":"2cb7ef80-3761-4574-9c1f-52405a401ebc","Type":"ContainerStarted","Data":"9c712aad889a51a85f9def886ed708a714b4ffe64f2ccb01d9e3ae84abd5b225"} Dec 01 21:44:33 crc kubenswrapper[4962]: I1201 21:44:33.554650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" event={"ID":"1a573027-bd61-4a79-b4da-d0b25cb44908","Type":"ContainerStarted","Data":"6d0913200ec690a13032b9323dfe75d334744c2d4ebb8e5fa5637efd9343c681"} Dec 01 21:44:33 crc kubenswrapper[4962]: I1201 21:44:33.554678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" event={"ID":"1a573027-bd61-4a79-b4da-d0b25cb44908","Type":"ContainerStarted","Data":"350a9713288a0b8a073c279b9af0d2c3ce29b62fe4366a39312c0529b8fb0c5d"} Dec 01 21:44:34 crc kubenswrapper[4962]: I1201 21:44:34.565516 4962 generic.go:334] "Generic (PLEG): container finished" podID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerID="02e2932038a4e45e6c5108b7c5af4d8d70e4a6e810d7de93e6f2e2660926b9a3" exitCode=0 Dec 01 21:44:34 crc kubenswrapper[4962]: I1201 21:44:34.565869 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" event={"ID":"2cb7ef80-3761-4574-9c1f-52405a401ebc","Type":"ContainerDied","Data":"02e2932038a4e45e6c5108b7c5af4d8d70e4a6e810d7de93e6f2e2660926b9a3"} Dec 01 21:44:34 crc kubenswrapper[4962]: I1201 21:44:34.571129 4962 generic.go:334] "Generic (PLEG): container finished" podID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerID="6d0913200ec690a13032b9323dfe75d334744c2d4ebb8e5fa5637efd9343c681" exitCode=0 Dec 01 21:44:34 crc kubenswrapper[4962]: I1201 21:44:34.571186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" event={"ID":"1a573027-bd61-4a79-b4da-d0b25cb44908","Type":"ContainerDied","Data":"6d0913200ec690a13032b9323dfe75d334744c2d4ebb8e5fa5637efd9343c681"} Dec 01 21:44:37 crc kubenswrapper[4962]: I1201 21:44:37.596214 4962 generic.go:334] "Generic (PLEG): container finished" podID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerID="a2903f32925744e5e2873f0b09a44812ebc92707706d8417f98a08639be9ed9c" exitCode=0 Dec 01 21:44:37 crc kubenswrapper[4962]: I1201 21:44:37.596268 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" event={"ID":"1a573027-bd61-4a79-b4da-d0b25cb44908","Type":"ContainerDied","Data":"a2903f32925744e5e2873f0b09a44812ebc92707706d8417f98a08639be9ed9c"} Dec 01 21:44:37 crc kubenswrapper[4962]: I1201 21:44:37.608514 4962 generic.go:334] "Generic (PLEG): container finished" podID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerID="966d019fd9d42a793abf1e9301d049e0bfa3fc1523bb4379745dc87aab3114ef" exitCode=0 Dec 01 21:44:37 crc kubenswrapper[4962]: I1201 21:44:37.608556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" event={"ID":"2cb7ef80-3761-4574-9c1f-52405a401ebc","Type":"ContainerDied","Data":"966d019fd9d42a793abf1e9301d049e0bfa3fc1523bb4379745dc87aab3114ef"} Dec 01 21:44:38 crc kubenswrapper[4962]: I1201 21:44:38.623276 4962 generic.go:334] "Generic (PLEG): container finished" podID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerID="7381605846bd0b26c763c984400ebf2950fc8f714e2ec73811cb94790eeaa585" exitCode=0 Dec 01 21:44:38 crc kubenswrapper[4962]: I1201 21:44:38.623434 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" event={"ID":"2cb7ef80-3761-4574-9c1f-52405a401ebc","Type":"ContainerDied","Data":"7381605846bd0b26c763c984400ebf2950fc8f714e2ec73811cb94790eeaa585"} Dec 01 21:44:38 crc kubenswrapper[4962]: I1201 21:44:38.642309 4962 generic.go:334] "Generic (PLEG): container finished" podID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerID="46dacd1eeb6872a73731de849532591298aebd1ae82fa2ba7718d229e37fd004" exitCode=0 Dec 01 21:44:38 crc kubenswrapper[4962]: I1201 21:44:38.642382 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" event={"ID":"1a573027-bd61-4a79-b4da-d0b25cb44908","Type":"ContainerDied","Data":"46dacd1eeb6872a73731de849532591298aebd1ae82fa2ba7718d229e37fd004"} Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.005998 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.011282 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.095843 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-util\") pod \"1a573027-bd61-4a79-b4da-d0b25cb44908\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.096152 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-bundle\") pod \"1a573027-bd61-4a79-b4da-d0b25cb44908\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.096266 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjzrl\" (UniqueName: \"kubernetes.io/projected/2cb7ef80-3761-4574-9c1f-52405a401ebc-kube-api-access-hjzrl\") pod \"2cb7ef80-3761-4574-9c1f-52405a401ebc\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.096386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-util\") pod \"2cb7ef80-3761-4574-9c1f-52405a401ebc\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.096622 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-bundle\") pod \"2cb7ef80-3761-4574-9c1f-52405a401ebc\" (UID: \"2cb7ef80-3761-4574-9c1f-52405a401ebc\") " Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.096754 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mshdc\" (UniqueName: \"kubernetes.io/projected/1a573027-bd61-4a79-b4da-d0b25cb44908-kube-api-access-mshdc\") pod \"1a573027-bd61-4a79-b4da-d0b25cb44908\" (UID: \"1a573027-bd61-4a79-b4da-d0b25cb44908\") " Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.098016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-bundle" (OuterVolumeSpecName: "bundle") pod "2cb7ef80-3761-4574-9c1f-52405a401ebc" (UID: "2cb7ef80-3761-4574-9c1f-52405a401ebc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.100643 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-bundle" (OuterVolumeSpecName: "bundle") pod "1a573027-bd61-4a79-b4da-d0b25cb44908" (UID: "1a573027-bd61-4a79-b4da-d0b25cb44908"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.103260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb7ef80-3761-4574-9c1f-52405a401ebc-kube-api-access-hjzrl" (OuterVolumeSpecName: "kube-api-access-hjzrl") pod "2cb7ef80-3761-4574-9c1f-52405a401ebc" (UID: "2cb7ef80-3761-4574-9c1f-52405a401ebc"). InnerVolumeSpecName "kube-api-access-hjzrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.105744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a573027-bd61-4a79-b4da-d0b25cb44908-kube-api-access-mshdc" (OuterVolumeSpecName: "kube-api-access-mshdc") pod "1a573027-bd61-4a79-b4da-d0b25cb44908" (UID: "1a573027-bd61-4a79-b4da-d0b25cb44908"). InnerVolumeSpecName "kube-api-access-mshdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.112915 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-util" (OuterVolumeSpecName: "util") pod "2cb7ef80-3761-4574-9c1f-52405a401ebc" (UID: "2cb7ef80-3761-4574-9c1f-52405a401ebc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.124058 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-util" (OuterVolumeSpecName: "util") pod "1a573027-bd61-4a79-b4da-d0b25cb44908" (UID: "1a573027-bd61-4a79-b4da-d0b25cb44908"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.197925 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-util\") on node \"crc\" DevicePath \"\"" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.198007 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a573027-bd61-4a79-b4da-d0b25cb44908-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.198024 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjzrl\" (UniqueName: \"kubernetes.io/projected/2cb7ef80-3761-4574-9c1f-52405a401ebc-kube-api-access-hjzrl\") on node \"crc\" DevicePath \"\"" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.198038 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-util\") on node \"crc\" DevicePath \"\"" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.198049 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2cb7ef80-3761-4574-9c1f-52405a401ebc-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.198060 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mshdc\" (UniqueName: \"kubernetes.io/projected/1a573027-bd61-4a79-b4da-d0b25cb44908-kube-api-access-mshdc\") on node \"crc\" DevicePath \"\"" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.662167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" event={"ID":"1a573027-bd61-4a79-b4da-d0b25cb44908","Type":"ContainerDied","Data":"350a9713288a0b8a073c279b9af0d2c3ce29b62fe4366a39312c0529b8fb0c5d"} Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.662212 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350a9713288a0b8a073c279b9af0d2c3ce29b62fe4366a39312c0529b8fb0c5d" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.662276 4962 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.664872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" event={"ID":"2cb7ef80-3761-4574-9c1f-52405a401ebc","Type":"ContainerDied","Data":"9c712aad889a51a85f9def886ed708a714b4ffe64f2ccb01d9e3ae84abd5b225"} Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.664907 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c712aad889a51a85f9def886ed708a714b4ffe64f2ccb01d9e3ae84abd5b225" Dec 01 21:44:40 crc kubenswrapper[4962]: I1201 21:44:40.664991 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.263456 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5"] Dec 01 21:44:49 crc kubenswrapper[4962]: E1201 21:44:49.264389 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerName="util" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264405 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerName="util" Dec 01 21:44:49 crc kubenswrapper[4962]: E1201 21:44:49.264425 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerName="extract" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264433 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerName="extract" Dec 01 21:44:49 crc kubenswrapper[4962]: E1201 21:44:49.264442 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerName="util" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264450 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerName="util" Dec 01 21:44:49 crc kubenswrapper[4962]: E1201 21:44:49.264463 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerName="pull" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264469 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerName="pull" Dec 01 21:44:49 crc kubenswrapper[4962]: E1201 21:44:49.264481 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerName="extract" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264487 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerName="extract" Dec 01 21:44:49 crc kubenswrapper[4962]: E1201 21:44:49.264507 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerName="pull" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264514 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerName="pull" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264654 4962 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2cb7ef80-3761-4574-9c1f-52405a401ebc" containerName="extract" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.264669 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a573027-bd61-4a79-b4da-d0b25cb44908" containerName="extract" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.265513 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.267536 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.267650 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-kxdkz" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.267709 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.267919 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.268127 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.268745 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.286137 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5"] Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.354544 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7b4dc242-7421-44a8-862e-b78e3e4310f3-manager-config\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.354714 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-apiservice-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.354749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.354770 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppx8\" (UniqueName: 
\"kubernetes.io/projected/7b4dc242-7421-44a8-862e-b78e3e4310f3-kube-api-access-hppx8\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.354803 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-webhook-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.456207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7b4dc242-7421-44a8-862e-b78e3e4310f3-manager-config\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.456286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-apiservice-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.456311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.456335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppx8\" (UniqueName: \"kubernetes.io/projected/7b4dc242-7421-44a8-862e-b78e3e4310f3-kube-api-access-hppx8\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.456366 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-webhook-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.457298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/7b4dc242-7421-44a8-862e-b78e3e4310f3-manager-config\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.462160 4962 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.463912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-webhook-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.465491 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7b4dc242-7421-44a8-862e-b78e3e4310f3-apiservice-cert\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.477713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppx8\" (UniqueName: \"kubernetes.io/projected/7b4dc242-7421-44a8-862e-b78e3e4310f3-kube-api-access-hppx8\") pod \"loki-operator-controller-manager-9658d6667-z5mx5\" (UID: \"7b4dc242-7421-44a8-862e-b78e3e4310f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.586445 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:44:49 crc kubenswrapper[4962]: I1201 21:44:49.812473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5"] Dec 01 21:44:50 crc kubenswrapper[4962]: I1201 21:44:50.743722 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" event={"ID":"7b4dc242-7421-44a8-862e-b78e3e4310f3","Type":"ContainerStarted","Data":"454c57220cb5ab4388613172bb7ac8a4a79163e781a78cd00b8cac1b9f8c6869"} Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.123232 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-6l74q"] Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.124518 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-6l74q" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.126129 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-fwn24" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.126994 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.127239 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.133247 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-6l74q"] Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.262736 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfz9n\" (UniqueName: \"kubernetes.io/projected/6965bdb4-04f5-486b-9897-b190e56d69b0-kube-api-access-bfz9n\") pod \"cluster-logging-operator-ff9846bd-6l74q\" (UID: \"6965bdb4-04f5-486b-9897-b190e56d69b0\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-6l74q" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.363700 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfz9n\" (UniqueName: \"kubernetes.io/projected/6965bdb4-04f5-486b-9897-b190e56d69b0-kube-api-access-bfz9n\") pod \"cluster-logging-operator-ff9846bd-6l74q\" (UID: \"6965bdb4-04f5-486b-9897-b190e56d69b0\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-6l74q" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.395520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfz9n\" (UniqueName: \"kubernetes.io/projected/6965bdb4-04f5-486b-9897-b190e56d69b0-kube-api-access-bfz9n\") pod \"cluster-logging-operator-ff9846bd-6l74q\" (UID: \"6965bdb4-04f5-486b-9897-b190e56d69b0\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-6l74q" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.446291 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-6l74q" Dec 01 21:44:54 crc kubenswrapper[4962]: I1201 21:44:54.982165 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-6l74q"] Dec 01 21:44:54 crc kubenswrapper[4962]: W1201 21:44:54.985870 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6965bdb4_04f5_486b_9897_b190e56d69b0.slice/crio-308f4befbd852086a35c9062b5bc5627b6733be5785575c7c7ea240be725c257 WatchSource:0}: Error finding container 308f4befbd852086a35c9062b5bc5627b6733be5785575c7c7ea240be725c257: Status 404 returned error can't find the container with id 308f4befbd852086a35c9062b5bc5627b6733be5785575c7c7ea240be725c257 Dec 01 21:44:55 crc kubenswrapper[4962]: I1201 21:44:55.800910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" event={"ID":"7b4dc242-7421-44a8-862e-b78e3e4310f3","Type":"ContainerStarted","Data":"e30e93af801ebfcd3e71012eed94268d0ef6d24b31a183520d211730363e6b4c"} Dec 01 21:44:55 crc kubenswrapper[4962]: I1201 21:44:55.802493 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-6l74q" event={"ID":"6965bdb4-04f5-486b-9897-b190e56d69b0","Type":"ContainerStarted","Data":"308f4befbd852086a35c9062b5bc5627b6733be5785575c7c7ea240be725c257"} Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.185138 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr"] Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.186430 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.190121 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.190311 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.250173 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr"] Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.279684 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-secret-volume\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.279768 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-config-volume\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.279892 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkj5\" (UniqueName: \"kubernetes.io/projected/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-kube-api-access-pmkj5\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.380875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkj5\" (UniqueName: \"kubernetes.io/projected/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-kube-api-access-pmkj5\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.380931 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-secret-volume\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.380976 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-config-volume\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.396215 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-secret-volume\") pod 
\"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.408754 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-config-volume\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.411553 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkj5\" (UniqueName: \"kubernetes.io/projected/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-kube-api-access-pmkj5\") pod \"collect-profiles-29410425-pmmvr\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:00 crc kubenswrapper[4962]: I1201 21:45:00.518449 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.397368 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr"] Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.866602 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-6l74q" event={"ID":"6965bdb4-04f5-486b-9897-b190e56d69b0","Type":"ContainerStarted","Data":"338fb1f9d7094f8c96c76d3944acf568086ede97845315d83e3810bb3187e819"} Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.869022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" event={"ID":"7b4dc242-7421-44a8-862e-b78e3e4310f3","Type":"ContainerStarted","Data":"9679a4281ee52b2b5e27a05d86918fcbeca0170e5a49c06b80428e156e8f5f06"} Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.869220 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.870215 4962 generic.go:334] "Generic (PLEG): container finished" podID="9b7d22d3-7996-4a9a-bc30-584a752a7ef9" containerID="921b517902c1b5debb58936b939673779b86f01b9281acf1e846becfc68d4504" exitCode=0 Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.870263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" event={"ID":"9b7d22d3-7996-4a9a-bc30-584a752a7ef9","Type":"ContainerDied","Data":"921b517902c1b5debb58936b939673779b86f01b9281acf1e846becfc68d4504"} Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.870291 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" event={"ID":"9b7d22d3-7996-4a9a-bc30-584a752a7ef9","Type":"ContainerStarted","Data":"c1c06e2dab6a5cc471c882d85c1580669faef3e5c30c477a5738df3fb408d8e8"} Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.872107 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" Dec 01 21:45:04 crc kubenswrapper[4962]: I1201 21:45:04.930842 4962 
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.231212 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr"
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.249757 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-9658d6667-z5mx5" podStartSLOduration=2.9474955720000002 podStartE2EDuration="17.249739519s" podCreationTimestamp="2025-12-01 21:44:49 +0000 UTC" firstStartedPulling="2025-12-01 21:44:49.82122123 +0000 UTC m=+673.922660425" lastFinishedPulling="2025-12-01 21:45:04.123465167 +0000 UTC m=+688.224904372" observedRunningTime="2025-12-01 21:45:04.957326277 +0000 UTC m=+689.058765472" watchObservedRunningTime="2025-12-01 21:45:06.249739519 +0000 UTC m=+690.351178714"
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.279773 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmkj5\" (UniqueName: \"kubernetes.io/projected/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-kube-api-access-pmkj5\") pod \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") "
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.279834 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-config-volume\") pod \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") "
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.279951 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-secret-volume\") pod \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\" (UID: \"9b7d22d3-7996-4a9a-bc30-584a752a7ef9\") "
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.281841 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b7d22d3-7996-4a9a-bc30-584a752a7ef9" (UID: "9b7d22d3-7996-4a9a-bc30-584a752a7ef9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.288081 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b7d22d3-7996-4a9a-bc30-584a752a7ef9" (UID: "9b7d22d3-7996-4a9a-bc30-584a752a7ef9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.304274 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-kube-api-access-pmkj5" (OuterVolumeSpecName: "kube-api-access-pmkj5") pod "9b7d22d3-7996-4a9a-bc30-584a752a7ef9" (UID: "9b7d22d3-7996-4a9a-bc30-584a752a7ef9"). InnerVolumeSpecName "kube-api-access-pmkj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.381204 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmkj5\" (UniqueName: \"kubernetes.io/projected/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-kube-api-access-pmkj5\") on node \"crc\" DevicePath \"\""
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.381236 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.381245 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7d22d3-7996-4a9a-bc30-584a752a7ef9-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.883678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr" event={"ID":"9b7d22d3-7996-4a9a-bc30-584a752a7ef9","Type":"ContainerDied","Data":"c1c06e2dab6a5cc471c882d85c1580669faef3e5c30c477a5738df3fb408d8e8"}
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.883732 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1c06e2dab6a5cc471c882d85c1580669faef3e5c30c477a5738df3fb408d8e8"
Dec 01 21:45:06 crc kubenswrapper[4962]: I1201 21:45:06.884255 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr"
Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.055109 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"]
Dec 01 21:45:10 crc kubenswrapper[4962]: E1201 21:45:10.055557 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7d22d3-7996-4a9a-bc30-584a752a7ef9" containerName="collect-profiles"
Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.055568 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7d22d3-7996-4a9a-bc30-584a752a7ef9" containerName="collect-profiles"
Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.055676 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7d22d3-7996-4a9a-bc30-584a752a7ef9" containerName="collect-profiles"
Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.056099 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Need to start a new one" pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.058597 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.058606 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.108502 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.135536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rmts\" (UniqueName: \"kubernetes.io/projected/9f0e5df4-8013-41b3-9d8a-6a3c9d21589a-kube-api-access-9rmts\") pod \"minio\" (UID: \"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a\") " pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.135792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8086506d-9395-4c11-b617-116183be2e8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8086506d-9395-4c11-b617-116183be2e8f\") pod \"minio\" (UID: \"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a\") " pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.236742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8086506d-9395-4c11-b617-116183be2e8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8086506d-9395-4c11-b617-116183be2e8f\") pod \"minio\" (UID: \"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a\") " pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.237354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rmts\" (UniqueName: \"kubernetes.io/projected/9f0e5df4-8013-41b3-9d8a-6a3c9d21589a-kube-api-access-9rmts\") pod \"minio\" (UID: \"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a\") " pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.240246 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.240284 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8086506d-9395-4c11-b617-116183be2e8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8086506d-9395-4c11-b617-116183be2e8f\") pod \"minio\" (UID: \"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f825774e14470ea4386bbb8fbec059edab568e61ff132b93d2752d5941d87be1/globalmount\"" pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.266266 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rmts\" (UniqueName: \"kubernetes.io/projected/9f0e5df4-8013-41b3-9d8a-6a3c9d21589a-kube-api-access-9rmts\") pod \"minio\" (UID: \"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a\") " pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.288085 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8086506d-9395-4c11-b617-116183be2e8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8086506d-9395-4c11-b617-116183be2e8f\") pod \"minio\" (UID: \"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a\") " pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.369571 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.800277 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 01 21:45:10 crc kubenswrapper[4962]: I1201 21:45:10.915708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a","Type":"ContainerStarted","Data":"e133d7366d70571fc6ef54d69eb4eb77057450d7358de968f7e43e8465e00e97"} Dec 01 21:45:20 crc kubenswrapper[4962]: I1201 21:45:20.998819 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9f0e5df4-8013-41b3-9d8a-6a3c9d21589a","Type":"ContainerStarted","Data":"eb13b8ec825d4396f46da2efb00298fa37d470f8e287996177f13384279c7d93"} Dec 01 21:45:21 crc kubenswrapper[4962]: I1201 21:45:21.025483 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.7783287869999995 podStartE2EDuration="14.025458954s" podCreationTimestamp="2025-12-01 21:45:07 +0000 UTC" firstStartedPulling="2025-12-01 21:45:10.813659655 +0000 UTC m=+694.915098870" lastFinishedPulling="2025-12-01 21:45:20.060789812 +0000 UTC m=+704.162229037" observedRunningTime="2025-12-01 21:45:21.020374079 +0000 UTC m=+705.121813314" watchObservedRunningTime="2025-12-01 21:45:21.025458954 +0000 UTC m=+705.126898189" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.858047 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-prtfr"] Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.859195 4962 util.go:30] "No sandbox for pod can be found. 
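The csi_attacher message above is informational rather than an error: the kubevirt.io.hostpath-provisioner driver does not advertise the CSI STAGE_UNSTAGE_VOLUME capability, so the kubelet skips the device-staging step and immediately reports MountDevice as succeeded with the globalmount path, leaving the real work to SetUp. To check which CSI driver and volume handle back a PV such as pvc-8086506d-..., the PersistentVolume spec carries both; a sketch with the official kubernetes Python client (assumes a reachable kubeconfig):

```python
from kubernetes import client, config

# Sketch: print the CSI driver and volume handle behind each PV, e.g.
# kubevirt.io.hostpath-provisioner for the minio PVC in the log above.
config.load_kube_config()  # inside a pod: config.load_incluster_config()
core = client.CoreV1Api()

for pv in core.list_persistent_volume().items:
    if pv.spec.csi:
        print(pv.metadata.name, pv.spec.csi.driver, pv.spec.csi.volume_handle)
```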
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.862997 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.863168 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.863279 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.863455 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-4248j" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.863635 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.878905 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-prtfr"] Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.883956 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.883995 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns5fx\" (UniqueName: \"kubernetes.io/projected/deb58cb2-860d-49d2-95e1-12aa147bd419-kube-api-access-ns5fx\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.884085 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb58cb2-860d-49d2-95e1-12aa147bd419-config\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.884117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.884141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.985672 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.985736 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns5fx\" (UniqueName: \"kubernetes.io/projected/deb58cb2-860d-49d2-95e1-12aa147bd419-kube-api-access-ns5fx\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.985819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb58cb2-860d-49d2-95e1-12aa147bd419-config\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.985853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.985881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.986813 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.986968 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb58cb2-860d-49d2-95e1-12aa147bd419-config\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:24 crc kubenswrapper[4962]: I1201 21:45:24.997871 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-khxd2"] Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.000099 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/deb58cb2-860d-49d2-95e1-12aa147bd419-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:25 
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.003289 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.006348 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.006645 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.006762 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.013046 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-khxd2"]
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.030818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns5fx\" (UniqueName: \"kubernetes.io/projected/deb58cb2-860d-49d2-95e1-12aa147bd419-kube-api-access-ns5fx\") pod \"logging-loki-distributor-76cc67bf56-prtfr\" (UID: \"deb58cb2-860d-49d2-95e1-12aa147bd419\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.089267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.089324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.089346 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.089382 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.089402 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9077bf-815a-4c1f-8956-bc4094f59ceb-config\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.089432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4bq\" (UniqueName: \"kubernetes.io/projected/5e9077bf-815a-4c1f-8956-bc4094f59ceb-kube-api-access-bt4bq\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.105348 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"]
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.106094 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.108050 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.108095 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.135010 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"]
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.177337 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"]
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.178638 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.180279 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr"
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.185552 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.185583 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.185698 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.185729 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.185835 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.192704 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"] Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.193119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.193176 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.193196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.193235 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.193255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9077bf-815a-4c1f-8956-bc4094f59ceb-config\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.193294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4bq\" (UniqueName: \"kubernetes.io/projected/5e9077bf-815a-4c1f-8956-bc4094f59ceb-kube-api-access-bt4bq\") 
pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.195138 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9077bf-815a-4c1f-8956-bc4094f59ceb-config\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.204611 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.209599 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.211676 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.211889 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e9077bf-815a-4c1f-8956-bc4094f59ceb-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.215976 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-749d76f66f-8szr5"] Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.250284 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.250665 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4bq\" (UniqueName: \"kubernetes.io/projected/5e9077bf-815a-4c1f-8956-bc4094f59ceb-kube-api-access-bt4bq\") pod \"logging-loki-querier-5895d59bb8-khxd2\" (UID: \"5e9077bf-815a-4c1f-8956-bc4094f59ceb\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.258477 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-749d76f66f-8szr5"] Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.264509 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-l4t5s" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297010 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297097 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-tls-secret\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-lokistack-gateway\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krt48\" (UniqueName: \"kubernetes.io/projected/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-kube-api-access-krt48\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " 
pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297191 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-rbac\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297213 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpz95\" (UniqueName: \"kubernetes.io/projected/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-kube-api-access-qpz95\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297235 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297254 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297362 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tls-secret\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297413 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-rbac\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297447 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-config\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297496 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " 
pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297535 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297566 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297616 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297661 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tenants\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-tenants\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297748 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-lokistack-gateway\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.297785 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqj5\" (UniqueName: \"kubernetes.io/projected/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-kube-api-access-6cqj5\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.371280 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398754 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cqj5\" (UniqueName: \"kubernetes.io/projected/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-kube-api-access-6cqj5\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398840 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398860 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-tls-secret\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398879 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-lokistack-gateway\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398900 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krt48\" (UniqueName: \"kubernetes.io/projected/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-kube-api-access-krt48\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-rbac\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc 
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.398991 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399008 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399035 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tls-secret\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399054 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-rbac\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399072 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-config\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399150 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399199 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tenants\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399260 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-lokistack-gateway\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.399275 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-tenants\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: E1201 21:45:25.403662 4962 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found
Dec 01 21:45:25 crc kubenswrapper[4962]: E1201 21:45:25.403734 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tls-secret podName:a89c265b-cf90-4c13-9e7e-ebd27f1b3463 nodeName:}" failed. No retries permitted until 2025-12-01 21:45:25.903714862 +0000 UTC m=+710.005154057 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tls-secret") pod "logging-loki-gateway-749d76f66f-8szr5" (UID: "a89c265b-cf90-4c13-9e7e-ebd27f1b3463") : secret "logging-loki-gateway-http" not found
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.404151 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.404161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-tenants\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.405076 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.405083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-rbac\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.406044 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-config\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.406131 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.406188 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-lokistack-gateway\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.407833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.409040 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-rbac\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.409078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.409292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.409364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-tls-secret\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.413734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.413744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tenants\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.413968 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-lokistack-gateway\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.414015 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5"
Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.417590 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"
\"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.421129 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cqj5\" (UniqueName: \"kubernetes.io/projected/8fdb44c5-cad3-460a-a6c8-90e65be7c1ce-kube-api-access-6cqj5\") pod \"logging-loki-gateway-749d76f66f-pnn6j\" (UID: \"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.427538 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krt48\" (UniqueName: \"kubernetes.io/projected/dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa-kube-api-access-krt48\") pod \"logging-loki-query-frontend-84558f7c9f-7wvzd\" (UID: \"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.429301 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpz95\" (UniqueName: \"kubernetes.io/projected/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-kube-api-access-qpz95\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.437739 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-prtfr"] Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.499539 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.722986 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.799950 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-khxd2"] Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.907142 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tls-secret\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.916613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a89c265b-cf90-4c13-9e7e-ebd27f1b3463-tls-secret\") pod \"logging-loki-gateway-749d76f66f-8szr5\" (UID: \"a89c265b-cf90-4c13-9e7e-ebd27f1b3463\") " pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:25 crc kubenswrapper[4962]: I1201 21:45:25.916775 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-749d76f66f-pnn6j"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.006073 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.006991 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.010174 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.010200 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.024204 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.042402 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" event={"ID":"5e9077bf-815a-4c1f-8956-bc4094f59ceb","Type":"ContainerStarted","Data":"41100dbdf2ddf867a71014155283da90fffd368f8f7dfde2b6ffe21b55668d33"} Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.043744 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" event={"ID":"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce","Type":"ContainerStarted","Data":"4763739a0b8433f0ebeef0e9f70dbc8c4b1edb9d7cf03079e54755ec756d81ca"} Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.045088 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" event={"ID":"deb58cb2-860d-49d2-95e1-12aa147bd419","Type":"ContainerStarted","Data":"5d37d5e6a048b686041e687f46076e5324a5bfe9e61179fa231c6b0a2301bd83"} Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.088520 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.089405 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.092368 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.096182 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.102527 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109379 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109449 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-config\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109616 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dd45\" (UniqueName: \"kubernetes.io/projected/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-kube-api-access-9dd45\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109656 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-425f4048-7562-4d9a-ad31-559150051f2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-425f4048-7562-4d9a-ad31-559150051f2e\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 
21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.109681 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bce17583-e35d-475e-a586-0294af821d97\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bce17583-e35d-475e-a586-0294af821d97\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: W1201 21:45:26.141246 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc99ea3a_cc9e_4e3d_9d0d_16aaa0ae5faa.slice/crio-1e13fa55b9065c549be45dfaeae81f4b4d79969cd5717bd41be6a6d4b88b3e12 WatchSource:0}: Error finding container 1e13fa55b9065c549be45dfaeae81f4b4d79969cd5717bd41be6a6d4b88b3e12: Status 404 returned error can't find the container with id 1e13fa55b9065c549be45dfaeae81f4b4d79969cd5717bd41be6a6d4b88b3e12 Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.145383 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.158131 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.160032 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.163469 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.163694 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.169297 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.200273 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.211038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-425f4048-7562-4d9a-ad31-559150051f2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-425f4048-7562-4d9a-ad31-559150051f2e\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.211125 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bce17583-e35d-475e-a586-0294af821d97\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bce17583-e35d-475e-a586-0294af821d97\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.211676 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.211723 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.211774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212253 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212297 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212392 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aa01dd83-c711-4e01-bc22-70325569e48a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa01dd83-c711-4e01-bc22-70325569e48a\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-config\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212476 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dd45\" (UniqueName: \"kubernetes.io/projected/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-kube-api-access-9dd45\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212511 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smksn\" (UniqueName: \"kubernetes.io/projected/b10ba804-253d-4972-bfd5-9f5fb9847989-kube-api-access-smksn\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.212539 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10ba804-253d-4972-bfd5-9f5fb9847989-config\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.213539 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.213687 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-config\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc 
kubenswrapper[4962]: I1201 21:45:26.215263 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.215300 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-425f4048-7562-4d9a-ad31-559150051f2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-425f4048-7562-4d9a-ad31-559150051f2e\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/60cdbe486c299de9152448c6c65d6b2e33af9adb73449dfdf9ca96f71b21d781/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.216481 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.216536 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bce17583-e35d-475e-a586-0294af821d97\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bce17583-e35d-475e-a586-0294af821d97\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/83e308fae0aaf1cd996f413ce4dbba5f17b35eefe23011fc3b0221b11b2fc2db/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.218231 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.222783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.233294 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.234436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dd45\" (UniqueName: \"kubernetes.io/projected/30d8b489-fae1-4ed5-8a5c-19d7bad83a3d-kube-api-access-9dd45\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.259712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-425f4048-7562-4d9a-ad31-559150051f2e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-425f4048-7562-4d9a-ad31-559150051f2e\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " 
pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.261104 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bce17583-e35d-475e-a586-0294af821d97\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bce17583-e35d-475e-a586-0294af821d97\") pod \"logging-loki-ingester-0\" (UID: \"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314323 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-65bf9809-32bc-47bf-9c48-64b853187883\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65bf9809-32bc-47bf-9c48-64b853187883\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314461 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aa01dd83-c711-4e01-bc22-70325569e48a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa01dd83-c711-4e01-bc22-70325569e48a\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8hr\" (UniqueName: \"kubernetes.io/projected/92297031-5f57-47f1-a6de-4a94b6490937-kube-api-access-zh8hr\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314597 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314622 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smksn\" (UniqueName: \"kubernetes.io/projected/b10ba804-253d-4972-bfd5-9f5fb9847989-kube-api-access-smksn\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314693 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10ba804-253d-4972-bfd5-9f5fb9847989-config\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314715 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92297031-5f57-47f1-a6de-4a94b6490937-config\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314763 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314795 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314816 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314858 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314884 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.314964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.317493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10ba804-253d-4972-bfd5-9f5fb9847989-config\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.320774 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " 
pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.322967 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.323580 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.323607 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aa01dd83-c711-4e01-bc22-70325569e48a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa01dd83-c711-4e01-bc22-70325569e48a\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cb56603435dc23e49cd0c7a0e74f5367f22fc51874e2220c7812aacceaacf3a0/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.324314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.327137 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b10ba804-253d-4972-bfd5-9f5fb9847989-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.345702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smksn\" (UniqueName: \"kubernetes.io/projected/b10ba804-253d-4972-bfd5-9f5fb9847989-kube-api-access-smksn\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.350884 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.358555 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aa01dd83-c711-4e01-bc22-70325569e48a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa01dd83-c711-4e01-bc22-70325569e48a\") pod \"logging-loki-compactor-0\" (UID: \"b10ba804-253d-4972-bfd5-9f5fb9847989\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.404453 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.416674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.416771 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8hr\" (UniqueName: \"kubernetes.io/projected/92297031-5f57-47f1-a6de-4a94b6490937-kube-api-access-zh8hr\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.416888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92297031-5f57-47f1-a6de-4a94b6490937-config\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.416954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.417030 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.417095 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.417128 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-65bf9809-32bc-47bf-9c48-64b853187883\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65bf9809-32bc-47bf-9c48-64b853187883\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.420712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92297031-5f57-47f1-a6de-4a94b6490937-config\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.432822 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.433264 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.434545 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.434599 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-65bf9809-32bc-47bf-9c48-64b853187883\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65bf9809-32bc-47bf-9c48-64b853187883\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bdaa1c8c844c4f70c25c35718a6cf5f10f357fdeaa8afd1f850a35d491112c4e/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.434624 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.436782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8hr\" (UniqueName: \"kubernetes.io/projected/92297031-5f57-47f1-a6de-4a94b6490937-kube-api-access-zh8hr\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.437764 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/92297031-5f57-47f1-a6de-4a94b6490937-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.478090 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-65bf9809-32bc-47bf-9c48-64b853187883\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65bf9809-32bc-47bf-9c48-64b853187883\") pod \"logging-loki-index-gateway-0\" (UID: \"92297031-5f57-47f1-a6de-4a94b6490937\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.484353 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.682514 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-749d76f66f-8szr5"] Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.844457 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: W1201 21:45:26.845098 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30d8b489_fae1_4ed5_8a5c_19d7bad83a3d.slice/crio-cc61854a1626dbd15841fe141abad8453fc25cf74864676a20b72c6342755069 WatchSource:0}: Error finding container cc61854a1626dbd15841fe141abad8453fc25cf74864676a20b72c6342755069: Status 404 returned error can't find the container with id cc61854a1626dbd15841fe141abad8453fc25cf74864676a20b72c6342755069 Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.867400 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 01 21:45:26 crc kubenswrapper[4962]: W1201 21:45:26.872639 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb10ba804_253d_4972_bfd5_9f5fb9847989.slice/crio-4fe6415e6f373e919e59fc08499f107fdb091802315410dd5c83890d8c479b3b WatchSource:0}: Error finding container 4fe6415e6f373e919e59fc08499f107fdb091802315410dd5c83890d8c479b3b: Status 404 returned error can't find the container with id 4fe6415e6f373e919e59fc08499f107fdb091802315410dd5c83890d8c479b3b Dec 01 21:45:26 crc kubenswrapper[4962]: I1201 21:45:26.907590 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 01 21:45:27 crc kubenswrapper[4962]: I1201 21:45:27.064642 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"b10ba804-253d-4972-bfd5-9f5fb9847989","Type":"ContainerStarted","Data":"4fe6415e6f373e919e59fc08499f107fdb091802315410dd5c83890d8c479b3b"} Dec 01 21:45:27 crc kubenswrapper[4962]: I1201 21:45:27.065894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" event={"ID":"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa","Type":"ContainerStarted","Data":"1e13fa55b9065c549be45dfaeae81f4b4d79969cd5717bd41be6a6d4b88b3e12"} Dec 01 21:45:27 crc kubenswrapper[4962]: I1201 21:45:27.068125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"92297031-5f57-47f1-a6de-4a94b6490937","Type":"ContainerStarted","Data":"45ea211db67e17934882d1bd794a9b76855cb2d72796de3939c65140d7250595"} Dec 01 21:45:27 crc kubenswrapper[4962]: I1201 21:45:27.068996 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d","Type":"ContainerStarted","Data":"cc61854a1626dbd15841fe141abad8453fc25cf74864676a20b72c6342755069"} Dec 01 21:45:27 crc kubenswrapper[4962]: I1201 21:45:27.069878 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" event={"ID":"a89c265b-cf90-4c13-9e7e-ebd27f1b3463","Type":"ContainerStarted","Data":"fede5b6d7b7d88270ecc8be434d974b9fecf3520fa3c85109434b578030b04c2"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.091083 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" event={"ID":"deb58cb2-860d-49d2-95e1-12aa147bd419","Type":"ContainerStarted","Data":"4ea9f591c2fff91cb47693eb92a7df45a1b58baec54a4cfa21f8f0f19eba8721"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.091773 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.100608 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" event={"ID":"dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa","Type":"ContainerStarted","Data":"e309fe0b3578f6b3eb324c6e9af13a367fa2520671135c7559185815c6999591"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.100996 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.104362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"92297031-5f57-47f1-a6de-4a94b6490937","Type":"ContainerStarted","Data":"01b77d1fce6585958a5d6af2b62065307cb4e0c45ecf3c20841eab4a3997a32e"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.104795 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.105869 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" event={"ID":"5e9077bf-815a-4c1f-8956-bc4094f59ceb","Type":"ContainerStarted","Data":"10cc4a274e970c1cd4abad72e6782263b40ce5597574c07bcf52708999ee536f"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.106012 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.110086 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"30d8b489-fae1-4ed5-8a5c-19d7bad83a3d","Type":"ContainerStarted","Data":"6a932c74d166bbd947f6554ec7595b0b01e9d23135f6336099c986001628e0e2"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.110806 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.123696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" event={"ID":"a89c265b-cf90-4c13-9e7e-ebd27f1b3463","Type":"ContainerStarted","Data":"85c017161257d839d3e604d4114f3439dafdc0131a215d16498e50b70d09800b"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.125495 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" event={"ID":"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce","Type":"ContainerStarted","Data":"441b3a3030c20047e18caa7be95f9010098157fbdb06170c65e945b7b5d15272"} Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.126702 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"b10ba804-253d-4972-bfd5-9f5fb9847989","Type":"ContainerStarted","Data":"d357115e444311fe6086a0ebe81c3c30842cf6a333b7633dceb4553b01d9eefc"} Dec 01 21:45:30 crc 
kubenswrapper[4962]: I1201 21:45:30.127427 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.137988 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" podStartSLOduration=2.191634832 podStartE2EDuration="6.137931148s" podCreationTimestamp="2025-12-01 21:45:24 +0000 UTC" firstStartedPulling="2025-12-01 21:45:25.448046607 +0000 UTC m=+709.549485802" lastFinishedPulling="2025-12-01 21:45:29.394342893 +0000 UTC m=+713.495782118" observedRunningTime="2025-12-01 21:45:30.112852932 +0000 UTC m=+714.214292147" watchObservedRunningTime="2025-12-01 21:45:30.137931148 +0000 UTC m=+714.239370383" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.141098 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" podStartSLOduration=1.845551548 podStartE2EDuration="5.141080868s" podCreationTimestamp="2025-12-01 21:45:25 +0000 UTC" firstStartedPulling="2025-12-01 21:45:26.143662493 +0000 UTC m=+710.245101688" lastFinishedPulling="2025-12-01 21:45:29.439191773 +0000 UTC m=+713.540631008" observedRunningTime="2025-12-01 21:45:30.129692223 +0000 UTC m=+714.231131418" watchObservedRunningTime="2025-12-01 21:45:30.141080868 +0000 UTC m=+714.242520103" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.153100 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=2.684145373 podStartE2EDuration="5.15308103s" podCreationTimestamp="2025-12-01 21:45:25 +0000 UTC" firstStartedPulling="2025-12-01 21:45:26.973252041 +0000 UTC m=+711.074691226" lastFinishedPulling="2025-12-01 21:45:29.442187668 +0000 UTC m=+713.543626883" observedRunningTime="2025-12-01 21:45:30.148248002 +0000 UTC m=+714.249687227" watchObservedRunningTime="2025-12-01 21:45:30.15308103 +0000 UTC m=+714.254520235" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.172007 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" podStartSLOduration=2.539191757 podStartE2EDuration="6.171985559s" podCreationTimestamp="2025-12-01 21:45:24 +0000 UTC" firstStartedPulling="2025-12-01 21:45:25.80987558 +0000 UTC m=+709.911314795" lastFinishedPulling="2025-12-01 21:45:29.442669402 +0000 UTC m=+713.544108597" observedRunningTime="2025-12-01 21:45:30.171077733 +0000 UTC m=+714.272516968" watchObservedRunningTime="2025-12-01 21:45:30.171985559 +0000 UTC m=+714.273424764" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.185849 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.595994438 podStartE2EDuration="6.185840775s" podCreationTimestamp="2025-12-01 21:45:24 +0000 UTC" firstStartedPulling="2025-12-01 21:45:26.850406716 +0000 UTC m=+710.951845911" lastFinishedPulling="2025-12-01 21:45:29.440253053 +0000 UTC m=+713.541692248" observedRunningTime="2025-12-01 21:45:30.184206188 +0000 UTC m=+714.285645383" watchObservedRunningTime="2025-12-01 21:45:30.185840775 +0000 UTC m=+714.287279970" Dec 01 21:45:30 crc kubenswrapper[4962]: I1201 21:45:30.207171 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" 
podStartSLOduration=2.642583797 podStartE2EDuration="5.207153543s" podCreationTimestamp="2025-12-01 21:45:25 +0000 UTC" firstStartedPulling="2025-12-01 21:45:26.875559774 +0000 UTC m=+710.976998969" lastFinishedPulling="2025-12-01 21:45:29.4401295 +0000 UTC m=+713.541568715" observedRunningTime="2025-12-01 21:45:30.203856529 +0000 UTC m=+714.305295754" watchObservedRunningTime="2025-12-01 21:45:30.207153543 +0000 UTC m=+714.308592748" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.146409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" event={"ID":"a89c265b-cf90-4c13-9e7e-ebd27f1b3463","Type":"ContainerStarted","Data":"d63ef2d646bf1c3c46ddda3e74075aaf51a5560e8cf15ffb6988a8847161a5bb"} Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.147370 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.147492 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.152007 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" event={"ID":"8fdb44c5-cad3-460a-a6c8-90e65be7c1ce","Type":"ContainerStarted","Data":"6f8eaab7f8fe10250d864e8e9639b40816d6d641fb6bd3b9314a0758598b08c2"} Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.152078 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.152452 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.186008 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" podStartSLOduration=2.040091885 podStartE2EDuration="7.185984197s" podCreationTimestamp="2025-12-01 21:45:25 +0000 UTC" firstStartedPulling="2025-12-01 21:45:26.684680358 +0000 UTC m=+710.786119563" lastFinishedPulling="2025-12-01 21:45:31.83057267 +0000 UTC m=+715.932011875" observedRunningTime="2025-12-01 21:45:32.180564759 +0000 UTC m=+716.282003964" watchObservedRunningTime="2025-12-01 21:45:32.185984197 +0000 UTC m=+716.287423432" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.218389 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" podStartSLOduration=1.313845256 podStartE2EDuration="7.21837129s" podCreationTimestamp="2025-12-01 21:45:25 +0000 UTC" firstStartedPulling="2025-12-01 21:45:25.921208716 +0000 UTC m=+710.022647951" lastFinishedPulling="2025-12-01 21:45:31.82573478 +0000 UTC m=+715.927173985" observedRunningTime="2025-12-01 21:45:32.213739475 +0000 UTC m=+716.315178700" watchObservedRunningTime="2025-12-01 21:45:32.21837129 +0000 UTC m=+716.319810495" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.293090 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.316998 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-749d76f66f-pnn6j" Dec 
01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.320555 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:32 crc kubenswrapper[4962]: I1201 21:45:32.320611 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-749d76f66f-8szr5" Dec 01 21:45:45 crc kubenswrapper[4962]: I1201 21:45:45.190227 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-prtfr" Dec 01 21:45:45 crc kubenswrapper[4962]: I1201 21:45:45.379840 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-khxd2" Dec 01 21:45:45 crc kubenswrapper[4962]: I1201 21:45:45.733854 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7wvzd" Dec 01 21:45:46 crc kubenswrapper[4962]: I1201 21:45:46.366622 4962 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 01 21:45:46 crc kubenswrapper[4962]: I1201 21:45:46.366702 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="30d8b489-fae1-4ed5-8a5c-19d7bad83a3d" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 21:45:46 crc kubenswrapper[4962]: I1201 21:45:46.432433 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Dec 01 21:45:46 crc kubenswrapper[4962]: I1201 21:45:46.491692 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 21:45:56 crc kubenswrapper[4962]: I1201 21:45:56.362048 4962 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 01 21:45:56 crc kubenswrapper[4962]: I1201 21:45:56.362907 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="30d8b489-fae1-4ed5-8a5c-19d7bad83a3d" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 21:46:06 crc kubenswrapper[4962]: I1201 21:46:06.360781 4962 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 01 21:46:06 crc kubenswrapper[4962]: I1201 21:46:06.361491 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="30d8b489-fae1-4ed5-8a5c-19d7bad83a3d" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 21:46:14 crc kubenswrapper[4962]: I1201 21:46:14.397643 4962 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 21:46:16 crc kubenswrapper[4962]: I1201 21:46:16.366585 4962 
patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 01 21:46:16 crc kubenswrapper[4962]: I1201 21:46:16.367094 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="30d8b489-fae1-4ed5-8a5c-19d7bad83a3d" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 21:46:26 crc kubenswrapper[4962]: I1201 21:46:26.361244 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 01 21:46:32 crc kubenswrapper[4962]: I1201 21:46:32.785002 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:46:32 crc kubenswrapper[4962]: I1201 21:46:32.786325 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.006998 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-dg84k"] Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.008529 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.015515 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-dg84k"] Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.016093 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.016374 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-9rfpn" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.016446 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.016669 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.016917 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.019566 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.157734 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-dg84k"] Dec 01 21:46:46 crc kubenswrapper[4962]: E1201 21:46:46.158428 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-z56gt metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: 
context canceled" pod="openshift-logging/collector-dg84k" podUID="ee451ea0-b37a-4b8e-94dc-de47e65716ec" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.205368 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ee451ea0-b37a-4b8e-94dc-de47e65716ec-datadir\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.205408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.205438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-token\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-sa-token\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206130 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-metrics\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206153 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-entrypoint\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee451ea0-b37a-4b8e-94dc-de47e65716ec-tmp\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z56gt\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-kube-api-access-z56gt\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206269 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-syslog-receiver\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " 
pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206290 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config-openshift-service-cacrt\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.206359 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-trusted-ca\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.307853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-metrics\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308146 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-entrypoint\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308237 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee451ea0-b37a-4b8e-94dc-de47e65716ec-tmp\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308336 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z56gt\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-kube-api-access-z56gt\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-syslog-receiver\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308767 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config-openshift-service-cacrt\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308858 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-trusted-ca\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308915 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ee451ea0-b37a-4b8e-94dc-de47e65716ec-datadir\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308952 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.308980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-token\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.309093 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-sa-token\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.309081 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ee451ea0-b37a-4b8e-94dc-de47e65716ec-datadir\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.309651 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config-openshift-service-cacrt\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.310094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-entrypoint\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.310104 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-trusted-ca\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.310214 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.314136 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee451ea0-b37a-4b8e-94dc-de47e65716ec-tmp\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.315205 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-token\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.316902 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-syslog-receiver\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.326547 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-metrics\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.327645 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z56gt\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-kube-api-access-z56gt\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:46 crc kubenswrapper[4962]: I1201 21:46:46.330383 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-sa-token\") pod \"collector-dg84k\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") " pod="openshift-logging/collector-dg84k" Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.125988 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-dg84k" Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.141173 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.321515 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-sa-token\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.322043 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config-openshift-service-cacrt\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.322328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-syslog-receiver\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.322606 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ee451ea0-b37a-4b8e-94dc-de47e65716ec-datadir\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.322697 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee451ea0-b37a-4b8e-94dc-de47e65716ec-datadir" (OuterVolumeSpecName: "datadir") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.322740 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.323475 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-metrics\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.324251 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-entrypoint\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.324418 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-trusted-ca\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.324534 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-token\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.325118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.325145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z56gt\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-kube-api-access-z56gt\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.325326 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee451ea0-b37a-4b8e-94dc-de47e65716ec-tmp\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.325210 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.325395 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config\") pod \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\" (UID: \"ee451ea0-b37a-4b8e-94dc-de47e65716ec\") "
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.326678 4962 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ee451ea0-b37a-4b8e-94dc-de47e65716ec-datadir\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.326727 4962 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-entrypoint\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.326755 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.326749 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config" (OuterVolumeSpecName: "config") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.326774 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-sa-token" (OuterVolumeSpecName: "sa-token") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.326780 4962 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.327637 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.330125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee451ea0-b37a-4b8e-94dc-de47e65716ec-tmp" (OuterVolumeSpecName: "tmp") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.330299 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-kube-api-access-z56gt" (OuterVolumeSpecName: "kube-api-access-z56gt") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "kube-api-access-z56gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.332244 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-token" (OuterVolumeSpecName: "collector-token") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.335795 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-metrics" (OuterVolumeSpecName: "metrics") pod "ee451ea0-b37a-4b8e-94dc-de47e65716ec" (UID: "ee451ea0-b37a-4b8e-94dc-de47e65716ec"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.428355 4962 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-syslog-receiver\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.428754 4962 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-metrics\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.428897 4962 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ee451ea0-b37a-4b8e-94dc-de47e65716ec-collector-token\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.429083 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z56gt\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-kube-api-access-z56gt\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.429207 4962 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee451ea0-b37a-4b8e-94dc-de47e65716ec-tmp\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.429338 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee451ea0-b37a-4b8e-94dc-de47e65716ec-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:47 crc kubenswrapper[4962]: I1201 21:46:47.429460 4962 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ee451ea0-b37a-4b8e-94dc-de47e65716ec-sa-token\") on node \"crc\" DevicePath \"\""
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.134825 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-dg84k"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.214465 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-dg84k"]
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.254741 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-dg84k"]
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.258740 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-ppj4c"]
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.261622 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.264509 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.265108 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.265277 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.266101 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-9rfpn"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.269503 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.277056 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.279885 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-ppj4c"]
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.357854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/15e991cf-b72c-462a-bc84-b157fee8ac90-tmp\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.357887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-metrics\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.357914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-config\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.357948 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4rpq\" (UniqueName: \"kubernetes.io/projected/15e991cf-b72c-462a-bc84-b157fee8ac90-kube-api-access-m4rpq\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.357972 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-collector-syslog-receiver\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.357993 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-collector-token\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.358056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-trusted-ca\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.358101 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-entrypoint\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.358119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-config-openshift-service-cacrt\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.358143 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/15e991cf-b72c-462a-bc84-b157fee8ac90-sa-token\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.358166 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/15e991cf-b72c-462a-bc84-b157fee8ac90-datadir\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-entrypoint\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-config-openshift-service-cacrt\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/15e991cf-b72c-462a-bc84-b157fee8ac90-sa-token\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460275 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/15e991cf-b72c-462a-bc84-b157fee8ac90-datadir\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/15e991cf-b72c-462a-bc84-b157fee8ac90-tmp\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-metrics\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460336 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-config\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4rpq\" (UniqueName: \"kubernetes.io/projected/15e991cf-b72c-462a-bc84-b157fee8ac90-kube-api-access-m4rpq\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460384 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-collector-syslog-receiver\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-collector-token\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.460447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-trusted-ca\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.461253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-trusted-ca\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.462320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-config-openshift-service-cacrt\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.462724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-config\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.462846 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/15e991cf-b72c-462a-bc84-b157fee8ac90-datadir\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.463352 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/15e991cf-b72c-462a-bc84-b157fee8ac90-entrypoint\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.466778 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-collector-syslog-receiver\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.467804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/15e991cf-b72c-462a-bc84-b157fee8ac90-tmp\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.468239 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-collector-token\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.468284 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/15e991cf-b72c-462a-bc84-b157fee8ac90-metrics\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.490828 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/15e991cf-b72c-462a-bc84-b157fee8ac90-sa-token\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.492532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4rpq\" (UniqueName: \"kubernetes.io/projected/15e991cf-b72c-462a-bc84-b157fee8ac90-kube-api-access-m4rpq\") pod \"collector-ppj4c\" (UID: \"15e991cf-b72c-462a-bc84-b157fee8ac90\") " pod="openshift-logging/collector-ppj4c"
pod="openshift-logging/collector-ppj4c" Dec 01 21:46:48 crc kubenswrapper[4962]: I1201 21:46:48.593090 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-ppj4c" Dec 01 21:46:49 crc kubenswrapper[4962]: I1201 21:46:49.044994 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-ppj4c"] Dec 01 21:46:49 crc kubenswrapper[4962]: I1201 21:46:49.142843 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-ppj4c" event={"ID":"15e991cf-b72c-462a-bc84-b157fee8ac90","Type":"ContainerStarted","Data":"7dcf761a9a6f1c8ec85428b70a9aad844efd2f409ce784c348b5a7b8bbfd4b8d"} Dec 01 21:46:50 crc kubenswrapper[4962]: I1201 21:46:50.236733 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee451ea0-b37a-4b8e-94dc-de47e65716ec" path="/var/lib/kubelet/pods/ee451ea0-b37a-4b8e-94dc-de47e65716ec/volumes" Dec 01 21:46:57 crc kubenswrapper[4962]: I1201 21:46:57.210613 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-ppj4c" event={"ID":"15e991cf-b72c-462a-bc84-b157fee8ac90","Type":"ContainerStarted","Data":"25dffd4cc75e5b493f913de212a14fe12ef721abec7588ecb71c4f1f237aaf32"} Dec 01 21:46:57 crc kubenswrapper[4962]: I1201 21:46:57.248010 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-ppj4c" podStartSLOduration=1.847161459 podStartE2EDuration="9.247921694s" podCreationTimestamp="2025-12-01 21:46:48 +0000 UTC" firstStartedPulling="2025-12-01 21:46:49.05722119 +0000 UTC m=+793.158660375" lastFinishedPulling="2025-12-01 21:46:56.457981415 +0000 UTC m=+800.559420610" observedRunningTime="2025-12-01 21:46:57.238692855 +0000 UTC m=+801.340132090" watchObservedRunningTime="2025-12-01 21:46:57.247921694 +0000 UTC m=+801.349360929" Dec 01 21:47:02 crc kubenswrapper[4962]: I1201 21:47:02.784294 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:47:02 crc kubenswrapper[4962]: I1201 21:47:02.785050 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:47:25 crc kubenswrapper[4962]: I1201 21:47:25.901681 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l"] Dec 01 21:47:25 crc kubenswrapper[4962]: I1201 21:47:25.903878 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:25 crc kubenswrapper[4962]: I1201 21:47:25.910051 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 21:47:25 crc kubenswrapper[4962]: I1201 21:47:25.910993 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l"] Dec 01 21:47:25 crc kubenswrapper[4962]: I1201 21:47:25.948437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:25 crc kubenswrapper[4962]: I1201 21:47:25.948512 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:25 crc kubenswrapper[4962]: I1201 21:47:25.948704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjwv\" (UniqueName: \"kubernetes.io/projected/fc7027df-456f-4562-8c33-b9902049338d-kube-api-access-dbjwv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.050368 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.050466 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.050536 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjwv\" (UniqueName: \"kubernetes.io/projected/fc7027df-456f-4562-8c33-b9902049338d-kube-api-access-dbjwv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.051082 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.051138 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.078882 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjwv\" (UniqueName: \"kubernetes.io/projected/fc7027df-456f-4562-8c33-b9902049338d-kube-api-access-dbjwv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.227088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:26 crc kubenswrapper[4962]: I1201 21:47:26.488423 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l"] Dec 01 21:47:27 crc kubenswrapper[4962]: I1201 21:47:27.511928 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc7027df-456f-4562-8c33-b9902049338d" containerID="1433e170121c34d61a14c1d8503a863492e008429df3297b0d1f6c7278afeb38" exitCode=0 Dec 01 21:47:27 crc kubenswrapper[4962]: I1201 21:47:27.511982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" event={"ID":"fc7027df-456f-4562-8c33-b9902049338d","Type":"ContainerDied","Data":"1433e170121c34d61a14c1d8503a863492e008429df3297b0d1f6c7278afeb38"} Dec 01 21:47:27 crc kubenswrapper[4962]: I1201 21:47:27.512030 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" event={"ID":"fc7027df-456f-4562-8c33-b9902049338d","Type":"ContainerStarted","Data":"c7a0a693c1662c63f6878f69a0dd0cbf51d3cd8dc739b30eaa8e12a21a06c644"} Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.270877 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c4mq6"] Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.274778 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.288699 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4mq6"]
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.399046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-catalog-content\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.399127 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-utilities\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.399516 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtfd\" (UniqueName: \"kubernetes.io/projected/0b7764f3-c427-4341-8a86-6ca32c186863-kube-api-access-jgtfd\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.501653 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-catalog-content\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.501710 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-utilities\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.501832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtfd\" (UniqueName: \"kubernetes.io/projected/0b7764f3-c427-4341-8a86-6ca32c186863-kube-api-access-jgtfd\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.502366 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-catalog-content\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.502421 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-utilities\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.519992 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgtfd\" (UniqueName: \"kubernetes.io/projected/0b7764f3-c427-4341-8a86-6ca32c186863-kube-api-access-jgtfd\") pod \"redhat-operators-c4mq6\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:28 crc kubenswrapper[4962]: I1201 21:47:28.642795 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4mq6"
Dec 01 21:47:29 crc kubenswrapper[4962]: I1201 21:47:29.133921 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4mq6"]
Dec 01 21:47:29 crc kubenswrapper[4962]: I1201 21:47:29.527461 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc7027df-456f-4562-8c33-b9902049338d" containerID="a2e903e3117139f641fcff90ddcd31c40dc451736d691e8f784bd8784397a48d" exitCode=0
Dec 01 21:47:29 crc kubenswrapper[4962]: I1201 21:47:29.527574 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" event={"ID":"fc7027df-456f-4562-8c33-b9902049338d","Type":"ContainerDied","Data":"a2e903e3117139f641fcff90ddcd31c40dc451736d691e8f784bd8784397a48d"}
Dec 01 21:47:29 crc kubenswrapper[4962]: I1201 21:47:29.530459 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b7764f3-c427-4341-8a86-6ca32c186863" containerID="59f9bc42ee8fc8b50eb40b6fc533ffa2bc75833e2ba25188bea87600712945bc" exitCode=0
Dec 01 21:47:29 crc kubenswrapper[4962]: I1201 21:47:29.530497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4mq6" event={"ID":"0b7764f3-c427-4341-8a86-6ca32c186863","Type":"ContainerDied","Data":"59f9bc42ee8fc8b50eb40b6fc533ffa2bc75833e2ba25188bea87600712945bc"}
Dec 01 21:47:29 crc kubenswrapper[4962]: I1201 21:47:29.530540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4mq6" event={"ID":"0b7764f3-c427-4341-8a86-6ca32c186863","Type":"ContainerStarted","Data":"eacbbeeedd877d0104b5b7d053933efa931a91d616f6bb6a79e4d93a817445c1"}
Dec 01 21:47:30 crc kubenswrapper[4962]: I1201 21:47:30.545273 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" event={"ID":"fc7027df-456f-4562-8c33-b9902049338d","Type":"ContainerStarted","Data":"dc6d99b414776e694375b4888aa0e30d8f505af7e007db82a501ee26b641b0aa"}
Dec 01 21:47:30 crc kubenswrapper[4962]: I1201 21:47:30.575544 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" podStartSLOduration=4.5263812770000005 podStartE2EDuration="5.575511071s" podCreationTimestamp="2025-12-01 21:47:25 +0000 UTC" firstStartedPulling="2025-12-01 21:47:27.514096742 +0000 UTC m=+831.615535937" lastFinishedPulling="2025-12-01 21:47:28.563226526 +0000 UTC m=+832.664665731" observedRunningTime="2025-12-01 21:47:30.565338925 +0000 UTC m=+834.666778190" watchObservedRunningTime="2025-12-01 21:47:30.575511071 +0000 UTC m=+834.676950347"
Dec 01 21:47:31 crc kubenswrapper[4962]: I1201 21:47:31.559761 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc7027df-456f-4562-8c33-b9902049338d" containerID="dc6d99b414776e694375b4888aa0e30d8f505af7e007db82a501ee26b641b0aa" exitCode=0
Dec 01 21:47:31 crc kubenswrapper[4962]: I1201 21:47:31.559858 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" event={"ID":"fc7027df-456f-4562-8c33-b9902049338d","Type":"ContainerDied","Data":"dc6d99b414776e694375b4888aa0e30d8f505af7e007db82a501ee26b641b0aa"}
Dec 01 21:47:31 crc kubenswrapper[4962]: I1201 21:47:31.561992 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4mq6" event={"ID":"0b7764f3-c427-4341-8a86-6ca32c186863","Type":"ContainerStarted","Data":"d39e484195523afa77fb20f0b70d4ce0492f19d6cd7703335b9fdc72f0a9e2ee"}
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.784714 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.784779 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.784835 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k"
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.785457 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2879c1f1c1a43cf7797f56147cd78f2bf5ee957daff607dcee5e6d23c293a8c"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.785511 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://d2879c1f1c1a43cf7797f56147cd78f2bf5ee957daff607dcee5e6d23c293a8c" gracePeriod=600
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.954543 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l"
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.972443 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-util\") pod \"fc7027df-456f-4562-8c33-b9902049338d\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") "
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.972809 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-bundle\") pod \"fc7027df-456f-4562-8c33-b9902049338d\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") "
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.972862 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbjwv\" (UniqueName: \"kubernetes.io/projected/fc7027df-456f-4562-8c33-b9902049338d-kube-api-access-dbjwv\") pod \"fc7027df-456f-4562-8c33-b9902049338d\" (UID: \"fc7027df-456f-4562-8c33-b9902049338d\") "
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.975772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-bundle" (OuterVolumeSpecName: "bundle") pod "fc7027df-456f-4562-8c33-b9902049338d" (UID: "fc7027df-456f-4562-8c33-b9902049338d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.982309 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7027df-456f-4562-8c33-b9902049338d-kube-api-access-dbjwv" (OuterVolumeSpecName: "kube-api-access-dbjwv") pod "fc7027df-456f-4562-8c33-b9902049338d" (UID: "fc7027df-456f-4562-8c33-b9902049338d"). InnerVolumeSpecName "kube-api-access-dbjwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:47:32 crc kubenswrapper[4962]: I1201 21:47:32.990511 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-util" (OuterVolumeSpecName: "util") pod "fc7027df-456f-4562-8c33-b9902049338d" (UID: "fc7027df-456f-4562-8c33-b9902049338d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.074901 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbjwv\" (UniqueName: \"kubernetes.io/projected/fc7027df-456f-4562-8c33-b9902049338d-kube-api-access-dbjwv\") on node \"crc\" DevicePath \"\""
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.074947 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-util\") on node \"crc\" DevicePath \"\""
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.074957 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7027df-456f-4562-8c33-b9902049338d-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.580441 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b7764f3-c427-4341-8a86-6ca32c186863" containerID="d39e484195523afa77fb20f0b70d4ce0492f19d6cd7703335b9fdc72f0a9e2ee" exitCode=0
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.580505 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4mq6" event={"ID":"0b7764f3-c427-4341-8a86-6ca32c186863","Type":"ContainerDied","Data":"d39e484195523afa77fb20f0b70d4ce0492f19d6cd7703335b9fdc72f0a9e2ee"}
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.584500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" event={"ID":"fc7027df-456f-4562-8c33-b9902049338d","Type":"ContainerDied","Data":"c7a0a693c1662c63f6878f69a0dd0cbf51d3cd8dc739b30eaa8e12a21a06c644"}
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.584561 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7a0a693c1662c63f6878f69a0dd0cbf51d3cd8dc739b30eaa8e12a21a06c644"
Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.584990 4962 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l" Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.589804 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="d2879c1f1c1a43cf7797f56147cd78f2bf5ee957daff607dcee5e6d23c293a8c" exitCode=0 Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.589866 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"d2879c1f1c1a43cf7797f56147cd78f2bf5ee957daff607dcee5e6d23c293a8c"} Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.589918 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"95b773e188f611e19f1e133dda091ac575dae9bb165debbd86a90d7593910a0b"} Dec 01 21:47:33 crc kubenswrapper[4962]: I1201 21:47:33.589963 4962 scope.go:117] "RemoveContainer" containerID="4b21fa950e24527d3d7f2945a34117bc2a69fe50d90966acf9350574b99da5ad" Dec 01 21:47:34 crc kubenswrapper[4962]: I1201 21:47:34.597378 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4mq6" event={"ID":"0b7764f3-c427-4341-8a86-6ca32c186863","Type":"ContainerStarted","Data":"6c7f530ee3f79e2e155a43ae6e592f99017d411f151019a958ccd16d02ef4a01"} Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.782691 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c4mq6" podStartSLOduration=3.057038212 podStartE2EDuration="7.782659203s" podCreationTimestamp="2025-12-01 21:47:28 +0000 UTC" firstStartedPulling="2025-12-01 21:47:29.531904909 +0000 UTC m=+833.633344114" lastFinishedPulling="2025-12-01 21:47:34.2575259 +0000 UTC m=+838.358965105" observedRunningTime="2025-12-01 21:47:34.613950917 +0000 UTC m=+838.715390132" watchObservedRunningTime="2025-12-01 21:47:35.782659203 +0000 UTC m=+839.884098458" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.783569 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt"] Dec 01 21:47:35 crc kubenswrapper[4962]: E1201 21:47:35.783991 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7027df-456f-4562-8c33-b9902049338d" containerName="pull" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.784013 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7027df-456f-4562-8c33-b9902049338d" containerName="pull" Dec 01 21:47:35 crc kubenswrapper[4962]: E1201 21:47:35.784049 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7027df-456f-4562-8c33-b9902049338d" containerName="extract" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.784060 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7027df-456f-4562-8c33-b9902049338d" containerName="extract" Dec 01 21:47:35 crc kubenswrapper[4962]: E1201 21:47:35.784079 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7027df-456f-4562-8c33-b9902049338d" containerName="util" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.784089 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7027df-456f-4562-8c33-b9902049338d" containerName="util" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.784323 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7027df-456f-4562-8c33-b9902049338d" containerName="extract" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.785132 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.788644 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.788720 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.798412 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-x5l6h" Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.820186 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt"] Dec 01 21:47:35 crc kubenswrapper[4962]: I1201 21:47:35.931895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwrkn\" (UniqueName: \"kubernetes.io/projected/c7fabe32-40b1-4300-bd18-c51c12e45a21-kube-api-access-lwrkn\") pod \"nmstate-operator-5b5b58f5c8-r6ndt\" (UID: \"c7fabe32-40b1-4300-bd18-c51c12e45a21\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" Dec 01 21:47:36 crc kubenswrapper[4962]: I1201 21:47:36.033271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwrkn\" (UniqueName: \"kubernetes.io/projected/c7fabe32-40b1-4300-bd18-c51c12e45a21-kube-api-access-lwrkn\") pod \"nmstate-operator-5b5b58f5c8-r6ndt\" (UID: \"c7fabe32-40b1-4300-bd18-c51c12e45a21\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" Dec 01 21:47:36 crc kubenswrapper[4962]: I1201 21:47:36.066898 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwrkn\" (UniqueName: \"kubernetes.io/projected/c7fabe32-40b1-4300-bd18-c51c12e45a21-kube-api-access-lwrkn\") pod \"nmstate-operator-5b5b58f5c8-r6ndt\" (UID: \"c7fabe32-40b1-4300-bd18-c51c12e45a21\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" Dec 01 21:47:36 crc kubenswrapper[4962]: I1201 21:47:36.128273 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" Dec 01 21:47:36 crc kubenswrapper[4962]: I1201 21:47:36.587568 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt"] Dec 01 21:47:36 crc kubenswrapper[4962]: W1201 21:47:36.592075 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7fabe32_40b1_4300_bd18_c51c12e45a21.slice/crio-d51f60ccc12ddd987bce0a60a8e5e25f026a0d7a0558ed7414e5aa6d8e368759 WatchSource:0}: Error finding container d51f60ccc12ddd987bce0a60a8e5e25f026a0d7a0558ed7414e5aa6d8e368759: Status 404 returned error can't find the container with id d51f60ccc12ddd987bce0a60a8e5e25f026a0d7a0558ed7414e5aa6d8e368759 Dec 01 21:47:36 crc kubenswrapper[4962]: I1201 21:47:36.614146 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" event={"ID":"c7fabe32-40b1-4300-bd18-c51c12e45a21","Type":"ContainerStarted","Data":"d51f60ccc12ddd987bce0a60a8e5e25f026a0d7a0558ed7414e5aa6d8e368759"} Dec 01 21:47:38 crc kubenswrapper[4962]: I1201 21:47:38.643797 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c4mq6" Dec 01 21:47:38 crc kubenswrapper[4962]: I1201 21:47:38.645227 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c4mq6" Dec 01 21:47:39 crc kubenswrapper[4962]: I1201 21:47:39.771953 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c4mq6" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="registry-server" probeResult="failure" output=< Dec 01 21:47:39 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 21:47:39 crc kubenswrapper[4962]: > Dec 01 21:47:40 crc kubenswrapper[4962]: I1201 21:47:40.644010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" event={"ID":"c7fabe32-40b1-4300-bd18-c51c12e45a21","Type":"ContainerStarted","Data":"d518db31678adc37ad701f5de361cfeb229c2c0f74cd0a7a3cda1d219408f6a8"} Dec 01 21:47:40 crc kubenswrapper[4962]: I1201 21:47:40.677088 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r6ndt" podStartSLOduration=2.075482118 podStartE2EDuration="5.677064601s" podCreationTimestamp="2025-12-01 21:47:35 +0000 UTC" firstStartedPulling="2025-12-01 21:47:36.594321324 +0000 UTC m=+840.695760519" lastFinishedPulling="2025-12-01 21:47:40.195903797 +0000 UTC m=+844.297343002" observedRunningTime="2025-12-01 21:47:40.671410159 +0000 UTC m=+844.772849364" watchObservedRunningTime="2025-12-01 21:47:40.677064601 +0000 UTC m=+844.778503856" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.456557 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.457778 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.463271 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.463475 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2x9w7" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.465731 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.467126 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.519199 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.550407 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.571663 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-s4vbq"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.572793 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.651119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njgrg\" (UniqueName: \"kubernetes.io/projected/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-kube-api-access-njgrg\") pod \"nmstate-webhook-5f6d4c5ccb-s5czc\" (UID: \"2473e9c3-5f3d-4122-ae3c-c0ef0de79201\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.651253 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s5czc\" (UID: \"2473e9c3-5f3d-4122-ae3c-c0ef0de79201\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.651305 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9lq\" (UniqueName: \"kubernetes.io/projected/838c46a9-9378-4801-8cc4-e203bf8c2972-kube-api-access-kj9lq\") pod \"nmstate-metrics-7f946cbc9-l62dd\" (UID: \"838c46a9-9378-4801-8cc4-e203bf8c2972\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.652703 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.653636 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.657542 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.657816 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9vflh" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.657978 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.674303 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njgrg\" (UniqueName: \"kubernetes.io/projected/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-kube-api-access-njgrg\") pod \"nmstate-webhook-5f6d4c5ccb-s5czc\" (UID: \"2473e9c3-5f3d-4122-ae3c-c0ef0de79201\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753277 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-dbus-socket\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753303 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rdvn\" (UniqueName: \"kubernetes.io/projected/770c0f72-8589-4617-8b07-92d0702ff5b8-kube-api-access-9rdvn\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753329 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9nqs\" (UniqueName: \"kubernetes.io/projected/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-kube-api-access-j9nqs\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753413 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-ovs-socket\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753447 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-nmstate-lock\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s5czc\" (UID: \"2473e9c3-5f3d-4122-ae3c-c0ef0de79201\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:47 crc kubenswrapper[4962]: E1201 21:47:47.753679 4962 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: E1201 21:47:47.753740 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-tls-key-pair podName:2473e9c3-5f3d-4122-ae3c-c0ef0de79201 nodeName:}" failed. No retries permitted until 2025-12-01 21:47:48.253718283 +0000 UTC m=+852.355157478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-s5czc" (UID: "2473e9c3-5f3d-4122-ae3c-c0ef0de79201") : secret "openshift-nmstate-webhook" not found Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.753817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9lq\" (UniqueName: \"kubernetes.io/projected/838c46a9-9378-4801-8cc4-e203bf8c2972-kube-api-access-kj9lq\") pod \"nmstate-metrics-7f946cbc9-l62dd\" (UID: \"838c46a9-9378-4801-8cc4-e203bf8c2972\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.779567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njgrg\" (UniqueName: \"kubernetes.io/projected/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-kube-api-access-njgrg\") pod \"nmstate-webhook-5f6d4c5ccb-s5czc\" (UID: \"2473e9c3-5f3d-4122-ae3c-c0ef0de79201\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.785644 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9lq\" (UniqueName: \"kubernetes.io/projected/838c46a9-9378-4801-8cc4-e203bf8c2972-kube-api-access-kj9lq\") pod \"nmstate-metrics-7f946cbc9-l62dd\" (UID: \"838c46a9-9378-4801-8cc4-e203bf8c2972\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.855492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 
21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.855859 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-dbus-socket\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.855889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rdvn\" (UniqueName: \"kubernetes.io/projected/770c0f72-8589-4617-8b07-92d0702ff5b8-kube-api-access-9rdvn\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.855912 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9nqs\" (UniqueName: \"kubernetes.io/projected/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-kube-api-access-j9nqs\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.855971 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-ovs-socket\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.856006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-nmstate-lock\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.856073 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: E1201 21:47:47.855767 4962 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.857127 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: E1201 21:47:47.857143 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-plugin-serving-cert podName:5a7b0f93-3ea3-4a0f-baef-4ca08977cbde nodeName:}" failed. No retries permitted until 2025-12-01 21:47:48.357108309 +0000 UTC m=+852.458547504 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-48pxc" (UID: "5a7b0f93-3ea3-4a0f-baef-4ca08977cbde") : secret "plugin-serving-cert" not found Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.857436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-ovs-socket\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.857482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-nmstate-lock\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.857442 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/770c0f72-8589-4617-8b07-92d0702ff5b8-dbus-socket\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.859956 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-578c49649f-mltwz"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.860847 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.870201 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.879152 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578c49649f-mltwz"] Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.886993 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9nqs\" (UniqueName: \"kubernetes.io/projected/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-kube-api-access-j9nqs\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:47 crc kubenswrapper[4962]: I1201 21:47:47.906628 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rdvn\" (UniqueName: \"kubernetes.io/projected/770c0f72-8589-4617-8b07-92d0702ff5b8-kube-api-access-9rdvn\") pod \"nmstate-handler-s4vbq\" (UID: \"770c0f72-8589-4617-8b07-92d0702ff5b8\") " pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.063630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-serving-cert\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.063984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-service-ca\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.064032 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-console-config\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.064058 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-oauth-config\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.064085 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqwfp\" (UniqueName: \"kubernetes.io/projected/75defef6-e656-452e-a623-8e1dd47c8078-kube-api-access-xqwfp\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.064140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-oauth-serving-cert\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 
21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.064353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-trusted-ca-bundle\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.165270 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-service-ca\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.165363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-console-config\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.165395 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-oauth-config\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.165429 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqwfp\" (UniqueName: \"kubernetes.io/projected/75defef6-e656-452e-a623-8e1dd47c8078-kube-api-access-xqwfp\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.165511 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-oauth-serving-cert\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.165541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-trusted-ca-bundle\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.165587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-serving-cert\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.167646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-service-ca\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 
21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.168239 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-console-config\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.169508 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-oauth-serving-cert\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.169981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-trusted-ca-bundle\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.172385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-oauth-config\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.172968 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-serving-cert\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.186135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqwfp\" (UniqueName: \"kubernetes.io/projected/75defef6-e656-452e-a623-8e1dd47c8078-kube-api-access-xqwfp\") pod \"console-578c49649f-mltwz\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.200261 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.234823 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.257707 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.262531 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd"] Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.266980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s5czc\" (UID: \"2473e9c3-5f3d-4122-ae3c-c0ef0de79201\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.270878 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2473e9c3-5f3d-4122-ae3c-c0ef0de79201-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-s5czc\" (UID: \"2473e9c3-5f3d-4122-ae3c-c0ef0de79201\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.369947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.375866 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7b0f93-3ea3-4a0f-baef-4ca08977cbde-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-48pxc\" (UID: \"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.466308 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.569494 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.703795 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578c49649f-mltwz"] Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.720290 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c4mq6" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.771674 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c4mq6" Dec 01 21:47:48 crc kubenswrapper[4962]: I1201 21:47:48.965592 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4mq6"] Dec 01 21:47:49 crc kubenswrapper[4962]: I1201 21:47:49.012160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4vbq" event={"ID":"770c0f72-8589-4617-8b07-92d0702ff5b8","Type":"ContainerStarted","Data":"8271968bc77905b0c2e8d1e6658124636d9b9adb50497bdd2c47f8496a4856cf"} Dec 01 21:47:49 crc kubenswrapper[4962]: I1201 21:47:49.013678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578c49649f-mltwz" event={"ID":"75defef6-e656-452e-a623-8e1dd47c8078","Type":"ContainerStarted","Data":"719c04e831220c088ca2d9bac2c0648a40fbdf7225b0a92e1eb15edcfc075d99"} Dec 01 21:47:49 crc kubenswrapper[4962]: I1201 21:47:49.013740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578c49649f-mltwz" event={"ID":"75defef6-e656-452e-a623-8e1dd47c8078","Type":"ContainerStarted","Data":"f6ecf38f2d33aad78a29e4a4a63f32add8c01e305bd00e74e4227c777384ac9a"} Dec 01 21:47:49 crc kubenswrapper[4962]: I1201 21:47:49.014499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" event={"ID":"838c46a9-9378-4801-8cc4-e203bf8c2972","Type":"ContainerStarted","Data":"3a3f0df06bae2b3dec2f07e0c76f741c11f89fb88be5f8ddb5a2d1e495237a7b"} Dec 01 21:47:49 crc kubenswrapper[4962]: I1201 21:47:49.031713 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc"] Dec 01 21:47:49 crc kubenswrapper[4962]: I1201 21:47:49.047746 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-578c49649f-mltwz" podStartSLOduration=2.047732705 podStartE2EDuration="2.047732705s" podCreationTimestamp="2025-12-01 21:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:47:49.039460778 +0000 UTC m=+853.140899993" watchObservedRunningTime="2025-12-01 21:47:49.047732705 +0000 UTC m=+853.149171900" Dec 01 21:47:49 crc kubenswrapper[4962]: I1201 21:47:49.134465 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc"] Dec 01 21:47:50 crc kubenswrapper[4962]: I1201 21:47:50.023855 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" event={"ID":"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde","Type":"ContainerStarted","Data":"96a3098d9ebe09c765405e51623bf84d39acb10af234cd73e3955a23d958be55"} Dec 01 21:47:50 crc kubenswrapper[4962]: I1201 21:47:50.025078 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" 
event={"ID":"2473e9c3-5f3d-4122-ae3c-c0ef0de79201","Type":"ContainerStarted","Data":"b1e6ae39594040e5aab33049d0a20cc12960ccc3d2b4268c96575702e5d7f373"} Dec 01 21:47:50 crc kubenswrapper[4962]: I1201 21:47:50.025201 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c4mq6" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="registry-server" containerID="cri-o://6c7f530ee3f79e2e155a43ae6e592f99017d411f151019a958ccd16d02ef4a01" gracePeriod=2 Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.036674 4962 generic.go:334] "Generic (PLEG): container finished" podID="0b7764f3-c427-4341-8a86-6ca32c186863" containerID="6c7f530ee3f79e2e155a43ae6e592f99017d411f151019a958ccd16d02ef4a01" exitCode=0 Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.036769 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4mq6" event={"ID":"0b7764f3-c427-4341-8a86-6ca32c186863","Type":"ContainerDied","Data":"6c7f530ee3f79e2e155a43ae6e592f99017d411f151019a958ccd16d02ef4a01"} Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.283132 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4mq6" Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.420979 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-utilities\") pod \"0b7764f3-c427-4341-8a86-6ca32c186863\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.421041 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgtfd\" (UniqueName: \"kubernetes.io/projected/0b7764f3-c427-4341-8a86-6ca32c186863-kube-api-access-jgtfd\") pod \"0b7764f3-c427-4341-8a86-6ca32c186863\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.421206 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-catalog-content\") pod \"0b7764f3-c427-4341-8a86-6ca32c186863\" (UID: \"0b7764f3-c427-4341-8a86-6ca32c186863\") " Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.422247 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-utilities" (OuterVolumeSpecName: "utilities") pod "0b7764f3-c427-4341-8a86-6ca32c186863" (UID: "0b7764f3-c427-4341-8a86-6ca32c186863"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.427104 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7764f3-c427-4341-8a86-6ca32c186863-kube-api-access-jgtfd" (OuterVolumeSpecName: "kube-api-access-jgtfd") pod "0b7764f3-c427-4341-8a86-6ca32c186863" (UID: "0b7764f3-c427-4341-8a86-6ca32c186863"). InnerVolumeSpecName "kube-api-access-jgtfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.522570 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.522602 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgtfd\" (UniqueName: \"kubernetes.io/projected/0b7764f3-c427-4341-8a86-6ca32c186863-kube-api-access-jgtfd\") on node \"crc\" DevicePath \"\"" Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.530547 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b7764f3-c427-4341-8a86-6ca32c186863" (UID: "0b7764f3-c427-4341-8a86-6ca32c186863"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:47:51 crc kubenswrapper[4962]: I1201 21:47:51.624195 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7764f3-c427-4341-8a86-6ca32c186863-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.058058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" event={"ID":"838c46a9-9378-4801-8cc4-e203bf8c2972","Type":"ContainerStarted","Data":"7b4f61425d41e30734103fb18a971ba93004273b334fe1fa1ed3c9f925d6703d"} Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.061015 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4mq6" event={"ID":"0b7764f3-c427-4341-8a86-6ca32c186863","Type":"ContainerDied","Data":"eacbbeeedd877d0104b5b7d053933efa931a91d616f6bb6a79e4d93a817445c1"} Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.061053 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c4mq6" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.061082 4962 scope.go:117] "RemoveContainer" containerID="6c7f530ee3f79e2e155a43ae6e592f99017d411f151019a958ccd16d02ef4a01" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.062840 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4vbq" event={"ID":"770c0f72-8589-4617-8b07-92d0702ff5b8","Type":"ContainerStarted","Data":"82b1cd186c9e1e28678984998457e1b5ccaa496449c2a359f91b87c637b2edc2"} Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.062982 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.070483 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" event={"ID":"2473e9c3-5f3d-4122-ae3c-c0ef0de79201","Type":"ContainerStarted","Data":"02902cdb9ff7026ebf121fb7e0dadf589c8da63d2137d1900dcf048702cc0244"} Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.070639 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.088580 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-s4vbq" podStartSLOduration=2.4101753009999998 podStartE2EDuration="5.088556039s" podCreationTimestamp="2025-12-01 21:47:47 +0000 UTC" firstStartedPulling="2025-12-01 21:47:48.260862931 +0000 UTC m=+852.362302126" lastFinishedPulling="2025-12-01 21:47:50.939243669 +0000 UTC m=+855.040682864" observedRunningTime="2025-12-01 21:47:52.078621204 +0000 UTC m=+856.180060409" watchObservedRunningTime="2025-12-01 21:47:52.088556039 +0000 UTC m=+856.189995234" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.103215 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" podStartSLOduration=3.200780042 podStartE2EDuration="5.102998303s" podCreationTimestamp="2025-12-01 21:47:47 +0000 UTC" firstStartedPulling="2025-12-01 21:47:49.040800836 +0000 UTC m=+853.142240031" lastFinishedPulling="2025-12-01 21:47:50.943019087 +0000 UTC m=+855.044458292" observedRunningTime="2025-12-01 21:47:52.097557217 +0000 UTC m=+856.198996442" watchObservedRunningTime="2025-12-01 21:47:52.102998303 +0000 UTC m=+856.204437508" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.110712 4962 scope.go:117] "RemoveContainer" containerID="d39e484195523afa77fb20f0b70d4ce0492f19d6cd7703335b9fdc72f0a9e2ee" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.126866 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4mq6"] Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.135325 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c4mq6"] Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.194108 4962 scope.go:117] "RemoveContainer" containerID="59f9bc42ee8fc8b50eb40b6fc533ffa2bc75833e2ba25188bea87600712945bc" Dec 01 21:47:52 crc kubenswrapper[4962]: I1201 21:47:52.229120 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" path="/var/lib/kubelet/pods/0b7764f3-c427-4341-8a86-6ca32c186863/volumes" Dec 01 21:47:53 crc kubenswrapper[4962]: I1201 21:47:53.081925 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" event={"ID":"5a7b0f93-3ea3-4a0f-baef-4ca08977cbde","Type":"ContainerStarted","Data":"5d9dab38c11b542674e86e386fd5596ea72fa9e2822ac802b048bbdcb5ca1eee"} Dec 01 21:47:54 crc kubenswrapper[4962]: I1201 21:47:54.094610 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" event={"ID":"838c46a9-9378-4801-8cc4-e203bf8c2972","Type":"ContainerStarted","Data":"3ed244b3341ab03756dc5b33d7247f2796565bf39fef2e798d1ff89b3ee83461"} Dec 01 21:47:54 crc kubenswrapper[4962]: I1201 21:47:54.118656 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-48pxc" podStartSLOduration=4.292265466 podStartE2EDuration="7.118631818s" podCreationTimestamp="2025-12-01 21:47:47 +0000 UTC" firstStartedPulling="2025-12-01 21:47:49.146777056 +0000 UTC m=+853.248216251" lastFinishedPulling="2025-12-01 21:47:51.973143398 +0000 UTC m=+856.074582603" observedRunningTime="2025-12-01 21:47:53.112280138 +0000 UTC m=+857.213719353" watchObservedRunningTime="2025-12-01 21:47:54.118631818 +0000 UTC m=+858.220071043" Dec 01 21:47:58 crc kubenswrapper[4962]: I1201 21:47:58.247897 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-s4vbq" Dec 01 21:47:58 crc kubenswrapper[4962]: I1201 21:47:58.258757 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:58 crc kubenswrapper[4962]: I1201 21:47:58.258807 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:58 crc kubenswrapper[4962]: I1201 21:47:58.268917 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:58 crc kubenswrapper[4962]: I1201 21:47:58.274663 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-l62dd" podStartSLOduration=6.217724674 podStartE2EDuration="11.274637865s" podCreationTimestamp="2025-12-01 21:47:47 +0000 UTC" firstStartedPulling="2025-12-01 21:47:48.234532986 +0000 UTC m=+852.335972191" lastFinishedPulling="2025-12-01 21:47:53.291446147 +0000 UTC m=+857.392885382" observedRunningTime="2025-12-01 21:47:54.121524591 +0000 UTC m=+858.222963776" watchObservedRunningTime="2025-12-01 21:47:58.274637865 +0000 UTC m=+862.376077090" Dec 01 21:47:59 crc kubenswrapper[4962]: I1201 21:47:59.172723 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:47:59 crc kubenswrapper[4962]: I1201 21:47:59.254964 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bcb5d4c85-jhx6l"] Dec 01 21:48:08 crc kubenswrapper[4962]: I1201 21:48:08.474013 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-s5czc" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.317592 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7bcb5d4c85-jhx6l" podUID="2571680f-abc0-4c0e-8178-c8e336cca4b4" containerName="console" containerID="cri-o://e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64" gracePeriod=15 Dec 01 21:48:24 crc kubenswrapper[4962]: 
I1201 21:48:24.715513 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bcb5d4c85-jhx6l_2571680f-abc0-4c0e-8178-c8e336cca4b4/console/0.log" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.715833 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.896439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-serving-cert\") pod \"2571680f-abc0-4c0e-8178-c8e336cca4b4\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.896492 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-oauth-serving-cert\") pod \"2571680f-abc0-4c0e-8178-c8e336cca4b4\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.896542 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-oauth-config\") pod \"2571680f-abc0-4c0e-8178-c8e336cca4b4\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.896568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-service-ca\") pod \"2571680f-abc0-4c0e-8178-c8e336cca4b4\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.896688 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-config\") pod \"2571680f-abc0-4c0e-8178-c8e336cca4b4\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.896734 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-trusted-ca-bundle\") pod \"2571680f-abc0-4c0e-8178-c8e336cca4b4\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.896803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xxth\" (UniqueName: \"kubernetes.io/projected/2571680f-abc0-4c0e-8178-c8e336cca4b4-kube-api-access-5xxth\") pod \"2571680f-abc0-4c0e-8178-c8e336cca4b4\" (UID: \"2571680f-abc0-4c0e-8178-c8e336cca4b4\") " Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.897465 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2571680f-abc0-4c0e-8178-c8e336cca4b4" (UID: "2571680f-abc0-4c0e-8178-c8e336cca4b4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.897764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-config" (OuterVolumeSpecName: "console-config") pod "2571680f-abc0-4c0e-8178-c8e336cca4b4" (UID: "2571680f-abc0-4c0e-8178-c8e336cca4b4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.898062 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-service-ca" (OuterVolumeSpecName: "service-ca") pod "2571680f-abc0-4c0e-8178-c8e336cca4b4" (UID: "2571680f-abc0-4c0e-8178-c8e336cca4b4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.899083 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2571680f-abc0-4c0e-8178-c8e336cca4b4" (UID: "2571680f-abc0-4c0e-8178-c8e336cca4b4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.904749 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2571680f-abc0-4c0e-8178-c8e336cca4b4-kube-api-access-5xxth" (OuterVolumeSpecName: "kube-api-access-5xxth") pod "2571680f-abc0-4c0e-8178-c8e336cca4b4" (UID: "2571680f-abc0-4c0e-8178-c8e336cca4b4"). InnerVolumeSpecName "kube-api-access-5xxth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.905604 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2571680f-abc0-4c0e-8178-c8e336cca4b4" (UID: "2571680f-abc0-4c0e-8178-c8e336cca4b4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.910094 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2571680f-abc0-4c0e-8178-c8e336cca4b4" (UID: "2571680f-abc0-4c0e-8178-c8e336cca4b4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.998244 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.998615 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.998626 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xxth\" (UniqueName: \"kubernetes.io/projected/2571680f-abc0-4c0e-8178-c8e336cca4b4-kube-api-access-5xxth\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.998637 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.998646 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.998654 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2571680f-abc0-4c0e-8178-c8e336cca4b4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:24 crc kubenswrapper[4962]: I1201 21:48:24.998662 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2571680f-abc0-4c0e-8178-c8e336cca4b4-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.418440 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bcb5d4c85-jhx6l_2571680f-abc0-4c0e-8178-c8e336cca4b4/console/0.log" Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.418727 4962 generic.go:334] "Generic (PLEG): container finished" podID="2571680f-abc0-4c0e-8178-c8e336cca4b4" containerID="e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64" exitCode=2 Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.418758 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcb5d4c85-jhx6l" event={"ID":"2571680f-abc0-4c0e-8178-c8e336cca4b4","Type":"ContainerDied","Data":"e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64"} Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.418783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcb5d4c85-jhx6l" event={"ID":"2571680f-abc0-4c0e-8178-c8e336cca4b4","Type":"ContainerDied","Data":"e2c3be77b352e830eb018340fa37c420d5dcfb10a66df42339e8612448ce847c"} Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.418799 4962 scope.go:117] "RemoveContainer" containerID="e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64" Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.418806 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bcb5d4c85-jhx6l" Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.453282 4962 scope.go:117] "RemoveContainer" containerID="e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64" Dec 01 21:48:25 crc kubenswrapper[4962]: E1201 21:48:25.453762 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64\": container with ID starting with e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64 not found: ID does not exist" containerID="e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64" Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.453799 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64"} err="failed to get container status \"e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64\": rpc error: code = NotFound desc = could not find container \"e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64\": container with ID starting with e3df5734e42aa4a5dabd49292e8e68f3ed43e97c21d59a3a4b79ac5b94138d64 not found: ID does not exist" Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.455701 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bcb5d4c85-jhx6l"] Dec 01 21:48:25 crc kubenswrapper[4962]: I1201 21:48:25.461888 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bcb5d4c85-jhx6l"] Dec 01 21:48:26 crc kubenswrapper[4962]: I1201 21:48:26.231026 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2571680f-abc0-4c0e-8178-c8e336cca4b4" path="/var/lib/kubelet/pods/2571680f-abc0-4c0e-8178-c8e336cca4b4/volumes" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.015703 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw"] Dec 01 21:48:28 crc kubenswrapper[4962]: E1201 21:48:28.016434 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="registry-server" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.016445 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="registry-server" Dec 01 21:48:28 crc kubenswrapper[4962]: E1201 21:48:28.016458 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="extract-utilities" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.016464 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="extract-utilities" Dec 01 21:48:28 crc kubenswrapper[4962]: E1201 21:48:28.016486 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2571680f-abc0-4c0e-8178-c8e336cca4b4" containerName="console" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.016493 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2571680f-abc0-4c0e-8178-c8e336cca4b4" containerName="console" Dec 01 21:48:28 crc kubenswrapper[4962]: E1201 21:48:28.016505 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="extract-content" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.016510 
4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="extract-content" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.016622 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2571680f-abc0-4c0e-8178-c8e336cca4b4" containerName="console" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.016632 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7764f3-c427-4341-8a86-6ca32c186863" containerName="registry-server" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.017583 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.020156 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.028274 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw"] Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.055481 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.055555 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.055667 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kslh2\" (UniqueName: \"kubernetes.io/projected/b6528667-e4ce-4641-9cbb-5ebfac003777-kube-api-access-kslh2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.157781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.157829 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 
21:48:28.157859 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kslh2\" (UniqueName: \"kubernetes.io/projected/b6528667-e4ce-4641-9cbb-5ebfac003777-kube-api-access-kslh2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.158401 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.158455 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.178743 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kslh2\" (UniqueName: \"kubernetes.io/projected/b6528667-e4ce-4641-9cbb-5ebfac003777-kube-api-access-kslh2\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.335889 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:28 crc kubenswrapper[4962]: I1201 21:48:28.845950 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw"] Dec 01 21:48:29 crc kubenswrapper[4962]: I1201 21:48:29.452517 4962 generic.go:334] "Generic (PLEG): container finished" podID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerID="85649e6cf7ca97678ab26931539e3d074656b1bafd6c481ea937b01ce483af10" exitCode=0 Dec 01 21:48:29 crc kubenswrapper[4962]: I1201 21:48:29.452576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" event={"ID":"b6528667-e4ce-4641-9cbb-5ebfac003777","Type":"ContainerDied","Data":"85649e6cf7ca97678ab26931539e3d074656b1bafd6c481ea937b01ce483af10"} Dec 01 21:48:29 crc kubenswrapper[4962]: I1201 21:48:29.452841 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" event={"ID":"b6528667-e4ce-4641-9cbb-5ebfac003777","Type":"ContainerStarted","Data":"735160ee12f3ed53061eefe64695fa56822fe4d6518eabdea90c21c5855d9b93"} Dec 01 21:48:31 crc kubenswrapper[4962]: I1201 21:48:31.476406 4962 generic.go:334] "Generic (PLEG): container finished" podID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerID="30e3d46b5cf6def15d3f01ca45e7aaf2cf73cce00f34fb52600c36d2fddbb9d5" exitCode=0 Dec 01 21:48:31 crc kubenswrapper[4962]: I1201 21:48:31.476488 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" event={"ID":"b6528667-e4ce-4641-9cbb-5ebfac003777","Type":"ContainerDied","Data":"30e3d46b5cf6def15d3f01ca45e7aaf2cf73cce00f34fb52600c36d2fddbb9d5"} Dec 01 21:48:32 crc kubenswrapper[4962]: I1201 21:48:32.488817 4962 generic.go:334] "Generic (PLEG): container finished" podID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerID="f8694f14f07dda6a22b148ca1eb917180b0d6bcc646c28815cc6d5fec942721f" exitCode=0 Dec 01 21:48:32 crc kubenswrapper[4962]: I1201 21:48:32.488892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" event={"ID":"b6528667-e4ce-4641-9cbb-5ebfac003777","Type":"ContainerDied","Data":"f8694f14f07dda6a22b148ca1eb917180b0d6bcc646c28815cc6d5fec942721f"} Dec 01 21:48:33 crc kubenswrapper[4962]: I1201 21:48:33.925194 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:33 crc kubenswrapper[4962]: I1201 21:48:33.992381 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kslh2\" (UniqueName: \"kubernetes.io/projected/b6528667-e4ce-4641-9cbb-5ebfac003777-kube-api-access-kslh2\") pod \"b6528667-e4ce-4641-9cbb-5ebfac003777\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " Dec 01 21:48:33 crc kubenswrapper[4962]: I1201 21:48:33.992547 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-util\") pod \"b6528667-e4ce-4641-9cbb-5ebfac003777\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " Dec 01 21:48:33 crc kubenswrapper[4962]: I1201 21:48:33.992720 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-bundle\") pod \"b6528667-e4ce-4641-9cbb-5ebfac003777\" (UID: \"b6528667-e4ce-4641-9cbb-5ebfac003777\") " Dec 01 21:48:33 crc kubenswrapper[4962]: I1201 21:48:33.994685 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-bundle" (OuterVolumeSpecName: "bundle") pod "b6528667-e4ce-4641-9cbb-5ebfac003777" (UID: "b6528667-e4ce-4641-9cbb-5ebfac003777"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.001419 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6528667-e4ce-4641-9cbb-5ebfac003777-kube-api-access-kslh2" (OuterVolumeSpecName: "kube-api-access-kslh2") pod "b6528667-e4ce-4641-9cbb-5ebfac003777" (UID: "b6528667-e4ce-4641-9cbb-5ebfac003777"). InnerVolumeSpecName "kube-api-access-kslh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.013860 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-util" (OuterVolumeSpecName: "util") pod "b6528667-e4ce-4641-9cbb-5ebfac003777" (UID: "b6528667-e4ce-4641-9cbb-5ebfac003777"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.095189 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.095267 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kslh2\" (UniqueName: \"kubernetes.io/projected/b6528667-e4ce-4641-9cbb-5ebfac003777-kube-api-access-kslh2\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.095288 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6528667-e4ce-4641-9cbb-5ebfac003777-util\") on node \"crc\" DevicePath \"\"" Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.512734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" event={"ID":"b6528667-e4ce-4641-9cbb-5ebfac003777","Type":"ContainerDied","Data":"735160ee12f3ed53061eefe64695fa56822fe4d6518eabdea90c21c5855d9b93"} Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.512812 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw" Dec 01 21:48:34 crc kubenswrapper[4962]: I1201 21:48:34.512820 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="735160ee12f3ed53061eefe64695fa56822fe4d6518eabdea90c21c5855d9b93" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.977090 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794"] Dec 01 21:48:43 crc kubenswrapper[4962]: E1201 21:48:43.977718 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerName="extract" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.977729 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerName="extract" Dec 01 21:48:43 crc kubenswrapper[4962]: E1201 21:48:43.977738 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerName="util" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.977744 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerName="util" Dec 01 21:48:43 crc kubenswrapper[4962]: E1201 21:48:43.977753 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerName="pull" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.977758 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerName="pull" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.977879 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6528667-e4ce-4641-9cbb-5ebfac003777" containerName="extract" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.978415 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.982600 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.982646 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.982766 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.982904 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.982996 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6b9xc" Dec 01 21:48:43 crc kubenswrapper[4962]: I1201 21:48:43.993699 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794"] Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.071321 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69tzq\" (UniqueName: \"kubernetes.io/projected/06d500dd-2267-451a-992d-d676f1033bb6-kube-api-access-69tzq\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.071383 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06d500dd-2267-451a-992d-d676f1033bb6-webhook-cert\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.071581 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06d500dd-2267-451a-992d-d676f1033bb6-apiservice-cert\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.172680 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06d500dd-2267-451a-992d-d676f1033bb6-apiservice-cert\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.172784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69tzq\" (UniqueName: \"kubernetes.io/projected/06d500dd-2267-451a-992d-d676f1033bb6-kube-api-access-69tzq\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.172820 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06d500dd-2267-451a-992d-d676f1033bb6-webhook-cert\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.179000 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06d500dd-2267-451a-992d-d676f1033bb6-webhook-cert\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.190644 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69tzq\" (UniqueName: \"kubernetes.io/projected/06d500dd-2267-451a-992d-d676f1033bb6-kube-api-access-69tzq\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.192286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06d500dd-2267-451a-992d-d676f1033bb6-apiservice-cert\") pod \"metallb-operator-controller-manager-fc9ff4f78-6q794\" (UID: \"06d500dd-2267-451a-992d-d676f1033bb6\") " pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.216271 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf"] Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.217923 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.220278 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.223146 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-fqkwt" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.227860 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.274194 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e46e036e-ca57-4675-a356-6a0cf72b184d-webhook-cert\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.274321 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e46e036e-ca57-4675-a356-6a0cf72b184d-apiservice-cert\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.274353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wnh7\" (UniqueName: \"kubernetes.io/projected/e46e036e-ca57-4675-a356-6a0cf72b184d-kube-api-access-5wnh7\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.292469 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf"] Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.318620 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.375826 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e46e036e-ca57-4675-a356-6a0cf72b184d-webhook-cert\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.376133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e46e036e-ca57-4675-a356-6a0cf72b184d-apiservice-cert\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.376216 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wnh7\" (UniqueName: \"kubernetes.io/projected/e46e036e-ca57-4675-a356-6a0cf72b184d-kube-api-access-5wnh7\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.393885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e46e036e-ca57-4675-a356-6a0cf72b184d-webhook-cert\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.393911 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e46e036e-ca57-4675-a356-6a0cf72b184d-apiservice-cert\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.397570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wnh7\" (UniqueName: \"kubernetes.io/projected/e46e036e-ca57-4675-a356-6a0cf72b184d-kube-api-access-5wnh7\") pod \"metallb-operator-webhook-server-74475bd8d7-k5jkf\" (UID: \"e46e036e-ca57-4675-a356-6a0cf72b184d\") " pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.540132 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.826096 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794"] Dec 01 21:48:44 crc kubenswrapper[4962]: W1201 21:48:44.830876 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d500dd_2267_451a_992d_d676f1033bb6.slice/crio-647df6706322e129add4a83dd163e75415ef8335020a851027dfe6d5732d693b WatchSource:0}: Error finding container 647df6706322e129add4a83dd163e75415ef8335020a851027dfe6d5732d693b: Status 404 returned error can't find the container with id 647df6706322e129add4a83dd163e75415ef8335020a851027dfe6d5732d693b Dec 01 21:48:44 crc kubenswrapper[4962]: W1201 21:48:44.962803 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode46e036e_ca57_4675_a356_6a0cf72b184d.slice/crio-0fd7b57d0996a13b5a2747c5a7f86998f203a3d84f868ac7e2557bae212d5fc5 WatchSource:0}: Error finding container 0fd7b57d0996a13b5a2747c5a7f86998f203a3d84f868ac7e2557bae212d5fc5: Status 404 returned error can't find the container with id 0fd7b57d0996a13b5a2747c5a7f86998f203a3d84f868ac7e2557bae212d5fc5 Dec 01 21:48:44 crc kubenswrapper[4962]: I1201 21:48:44.965681 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf"] Dec 01 21:48:45 crc kubenswrapper[4962]: I1201 21:48:45.591295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" event={"ID":"e46e036e-ca57-4675-a356-6a0cf72b184d","Type":"ContainerStarted","Data":"0fd7b57d0996a13b5a2747c5a7f86998f203a3d84f868ac7e2557bae212d5fc5"} Dec 01 21:48:45 crc kubenswrapper[4962]: I1201 21:48:45.594140 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" event={"ID":"06d500dd-2267-451a-992d-d676f1033bb6","Type":"ContainerStarted","Data":"647df6706322e129add4a83dd163e75415ef8335020a851027dfe6d5732d693b"} Dec 01 21:48:51 crc kubenswrapper[4962]: I1201 21:48:51.671144 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" event={"ID":"e46e036e-ca57-4675-a356-6a0cf72b184d","Type":"ContainerStarted","Data":"f66deca6dc22812c97df8fe276b8fde53b640c440a1d8150e600d9024e3491e8"} Dec 01 21:48:51 crc kubenswrapper[4962]: I1201 21:48:51.672029 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:48:51 crc kubenswrapper[4962]: I1201 21:48:51.674488 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" event={"ID":"06d500dd-2267-451a-992d-d676f1033bb6","Type":"ContainerStarted","Data":"1c9406c9db2dc41db7f36ebc849a2cbc65bbcb399ab71e6ba4d707cc01a9a529"} Dec 01 21:48:51 crc kubenswrapper[4962]: I1201 21:48:51.674806 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:48:51 crc kubenswrapper[4962]: I1201 21:48:51.699180 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" 
podStartSLOduration=2.03327233 podStartE2EDuration="7.699150194s" podCreationTimestamp="2025-12-01 21:48:44 +0000 UTC" firstStartedPulling="2025-12-01 21:48:44.966075936 +0000 UTC m=+909.067515131" lastFinishedPulling="2025-12-01 21:48:50.63195379 +0000 UTC m=+914.733392995" observedRunningTime="2025-12-01 21:48:51.698683491 +0000 UTC m=+915.800122696" watchObservedRunningTime="2025-12-01 21:48:51.699150194 +0000 UTC m=+915.800589469" Dec 01 21:48:51 crc kubenswrapper[4962]: I1201 21:48:51.730869 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" podStartSLOduration=2.957535395 podStartE2EDuration="8.730843876s" podCreationTimestamp="2025-12-01 21:48:43 +0000 UTC" firstStartedPulling="2025-12-01 21:48:44.832851525 +0000 UTC m=+908.934290720" lastFinishedPulling="2025-12-01 21:48:50.606160006 +0000 UTC m=+914.707599201" observedRunningTime="2025-12-01 21:48:51.72642281 +0000 UTC m=+915.827862025" watchObservedRunningTime="2025-12-01 21:48:51.730843876 +0000 UTC m=+915.832283101" Dec 01 21:49:04 crc kubenswrapper[4962]: I1201 21:49:04.559345 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74475bd8d7-k5jkf" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.691681 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-89xg9"] Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.695640 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.704930 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89xg9"] Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.731003 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-utilities\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.731080 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-catalog-content\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.731109 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcm6\" (UniqueName: \"kubernetes.io/projected/bac2c3d5-d3cc-44de-b838-ad03aff719d7-kube-api-access-jfcm6\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.832193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-utilities\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.832260 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-catalog-content\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.832284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcm6\" (UniqueName: \"kubernetes.io/projected/bac2c3d5-d3cc-44de-b838-ad03aff719d7-kube-api-access-jfcm6\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.833149 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-utilities\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.833460 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-catalog-content\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:18 crc kubenswrapper[4962]: I1201 21:49:18.854979 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfcm6\" (UniqueName: \"kubernetes.io/projected/bac2c3d5-d3cc-44de-b838-ad03aff719d7-kube-api-access-jfcm6\") pod \"redhat-marketplace-89xg9\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:19 crc kubenswrapper[4962]: I1201 21:49:19.019907 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:19 crc kubenswrapper[4962]: I1201 21:49:19.490450 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89xg9"] Dec 01 21:49:19 crc kubenswrapper[4962]: I1201 21:49:19.949468 4962 generic.go:334] "Generic (PLEG): container finished" podID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerID="82b490220539dd11f3e6e0f98d252a532e43e86f473b4a4523cb1a3a7d689d8d" exitCode=0 Dec 01 21:49:19 crc kubenswrapper[4962]: I1201 21:49:19.949603 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89xg9" event={"ID":"bac2c3d5-d3cc-44de-b838-ad03aff719d7","Type":"ContainerDied","Data":"82b490220539dd11f3e6e0f98d252a532e43e86f473b4a4523cb1a3a7d689d8d"} Dec 01 21:49:19 crc kubenswrapper[4962]: I1201 21:49:19.949923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89xg9" event={"ID":"bac2c3d5-d3cc-44de-b838-ad03aff719d7","Type":"ContainerStarted","Data":"e9272c97fbcc376ce3265554019a85a61e10c44e724e8b6674692a1acddb32c4"} Dec 01 21:49:21 crc kubenswrapper[4962]: E1201 21:49:21.693905 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbac2c3d5_d3cc_44de_b838_ad03aff719d7.slice/crio-b862cea8a694cf220a503f3b5cd35df781ea26e11e0c1b2ba34360da6a49c1bb.scope\": RecentStats: unable to find data in memory cache]" Dec 01 21:49:21 crc kubenswrapper[4962]: I1201 21:49:21.972156 4962 generic.go:334] "Generic (PLEG): container finished" podID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerID="b862cea8a694cf220a503f3b5cd35df781ea26e11e0c1b2ba34360da6a49c1bb" exitCode=0 Dec 01 21:49:21 crc kubenswrapper[4962]: I1201 21:49:21.972199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89xg9" event={"ID":"bac2c3d5-d3cc-44de-b838-ad03aff719d7","Type":"ContainerDied","Data":"b862cea8a694cf220a503f3b5cd35df781ea26e11e0c1b2ba34360da6a49c1bb"} Dec 01 21:49:22 crc kubenswrapper[4962]: I1201 21:49:22.984133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89xg9" event={"ID":"bac2c3d5-d3cc-44de-b838-ad03aff719d7","Type":"ContainerStarted","Data":"fa064ef1ee6b9f30ff0b15e24c1a0226e584c5dae6eeb42ff297f7e49f9b899f"} Dec 01 21:49:23 crc kubenswrapper[4962]: I1201 21:49:23.014765 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-89xg9" podStartSLOduration=2.455474131 podStartE2EDuration="5.014736436s" podCreationTimestamp="2025-12-01 21:49:18 +0000 UTC" firstStartedPulling="2025-12-01 21:49:19.951295476 +0000 UTC m=+944.052734701" lastFinishedPulling="2025-12-01 21:49:22.510557771 +0000 UTC m=+946.611997006" observedRunningTime="2025-12-01 21:49:23.008359095 +0000 UTC m=+947.109798330" watchObservedRunningTime="2025-12-01 21:49:23.014736436 +0000 UTC m=+947.116175671" Dec 01 21:49:24 crc kubenswrapper[4962]: I1201 21:49:24.322864 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-fc9ff4f78-6q794" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.143373 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mgx99"] Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.148681 4962 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.151492 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-sr7hk" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.152268 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.154059 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.157538 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6"] Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.158994 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.164572 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.176947 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6"] Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-sockets\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-reloader\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244516 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-metrics\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244545 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-startup\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244591 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwsb5\" (UniqueName: \"kubernetes.io/projected/ba7de090-9085-47a3-a086-73f78775d865-kube-api-access-pwsb5\") pod \"frr-k8s-webhook-server-7fcb986d4-2xwv6\" (UID: \"ba7de090-9085-47a3-a086-73f78775d865\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244642 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrgwl\" (UniqueName: \"kubernetes.io/projected/7590e32f-b0cb-46dc-a679-46b2ede43ba0-kube-api-access-mrgwl\") pod 
\"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244677 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-conf\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244699 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7de090-9085-47a3-a086-73f78775d865-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-2xwv6\" (UID: \"ba7de090-9085-47a3-a086-73f78775d865\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.244731 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7590e32f-b0cb-46dc-a679-46b2ede43ba0-metrics-certs\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.283795 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-z5gxh"] Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.284904 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.288368 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.288686 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-bqb9c" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.288827 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.288985 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.300731 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-hf6jx"] Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.307753 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.311206 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.323118 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-hf6jx"] Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346601 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/519181c6-2c70-42ee-825f-427fe5942b07-metrics-certs\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346656 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/519181c6-2c70-42ee-825f-427fe5942b07-cert\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346682 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-sockets\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346717 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-reloader\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czfsl\" (UniqueName: \"kubernetes.io/projected/519181c6-2c70-42ee-825f-427fe5942b07-kube-api-access-czfsl\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346759 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-metrics\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346777 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-startup\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsb5\" (UniqueName: \"kubernetes.io/projected/ba7de090-9085-47a3-a086-73f78775d865-kube-api-access-pwsb5\") pod \"frr-k8s-webhook-server-7fcb986d4-2xwv6\" (UID: \"ba7de090-9085-47a3-a086-73f78775d865\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346836 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrgwl\" (UniqueName: \"kubernetes.io/projected/7590e32f-b0cb-46dc-a679-46b2ede43ba0-kube-api-access-mrgwl\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346856 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-conf\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346872 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7de090-9085-47a3-a086-73f78775d865-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-2xwv6\" (UID: \"ba7de090-9085-47a3-a086-73f78775d865\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.346897 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7590e32f-b0cb-46dc-a679-46b2ede43ba0-metrics-certs\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.349190 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-metrics\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.349462 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-sockets\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.349718 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-reloader\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.350229 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-conf\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.351164 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7590e32f-b0cb-46dc-a679-46b2ede43ba0-frr-startup\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.356527 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba7de090-9085-47a3-a086-73f78775d865-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-2xwv6\" (UID: \"ba7de090-9085-47a3-a086-73f78775d865\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc 
kubenswrapper[4962]: I1201 21:49:25.356530 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7590e32f-b0cb-46dc-a679-46b2ede43ba0-metrics-certs\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.369965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsb5\" (UniqueName: \"kubernetes.io/projected/ba7de090-9085-47a3-a086-73f78775d865-kube-api-access-pwsb5\") pod \"frr-k8s-webhook-server-7fcb986d4-2xwv6\" (UID: \"ba7de090-9085-47a3-a086-73f78775d865\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.370036 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrgwl\" (UniqueName: \"kubernetes.io/projected/7590e32f-b0cb-46dc-a679-46b2ede43ba0-kube-api-access-mrgwl\") pod \"frr-k8s-mgx99\" (UID: \"7590e32f-b0cb-46dc-a679-46b2ede43ba0\") " pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.448596 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czfsl\" (UniqueName: \"kubernetes.io/projected/519181c6-2c70-42ee-825f-427fe5942b07-kube-api-access-czfsl\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.448668 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-metallb-excludel2\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.448707 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-metrics-certs\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.448740 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.448756 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/519181c6-2c70-42ee-825f-427fe5942b07-metrics-certs\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.448786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/519181c6-2c70-42ee-825f-427fe5942b07-cert\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.448810 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72g9\" (UniqueName: \"kubernetes.io/projected/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-kube-api-access-h72g9\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.455441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/519181c6-2c70-42ee-825f-427fe5942b07-metrics-certs\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.468606 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/519181c6-2c70-42ee-825f-427fe5942b07-cert\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.468684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czfsl\" (UniqueName: \"kubernetes.io/projected/519181c6-2c70-42ee-825f-427fe5942b07-kube-api-access-czfsl\") pod \"controller-f8648f98b-hf6jx\" (UID: \"519181c6-2c70-42ee-825f-427fe5942b07\") " pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.524118 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.534158 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.549722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-metrics-certs\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.549790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.549837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h72g9\" (UniqueName: \"kubernetes.io/projected/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-kube-api-access-h72g9\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.549896 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-metallb-excludel2\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: E1201 21:49:25.549967 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 21:49:25 crc kubenswrapper[4962]: E1201 21:49:25.550020 4962 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist podName:9dc8d3dc-4cdb-45b7-a54f-83db94bdde05 nodeName:}" failed. No retries permitted until 2025-12-01 21:49:26.050003959 +0000 UTC m=+950.151443154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist") pod "speaker-z5gxh" (UID: "9dc8d3dc-4cdb-45b7-a54f-83db94bdde05") : secret "metallb-memberlist" not found Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.550519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-metallb-excludel2\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.557329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-metrics-certs\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.571750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72g9\" (UniqueName: \"kubernetes.io/projected/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-kube-api-access-h72g9\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.624859 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:25 crc kubenswrapper[4962]: I1201 21:49:25.993604 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6"] Dec 01 21:49:26 crc kubenswrapper[4962]: I1201 21:49:26.014981 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" event={"ID":"ba7de090-9085-47a3-a086-73f78775d865","Type":"ContainerStarted","Data":"c2a8ba29fbee87c3a9750697d3ccf8ec92f68767fd44c8f4682a9c564f46ecbd"} Dec 01 21:49:26 crc kubenswrapper[4962]: I1201 21:49:26.017853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerStarted","Data":"a4555f4e0adaa9d2f35515f1ba755e5b3fa619db064e76898e9b98a437e23c23"} Dec 01 21:49:26 crc kubenswrapper[4962]: I1201 21:49:26.147986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:26 crc kubenswrapper[4962]: E1201 21:49:26.148166 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 21:49:26 crc kubenswrapper[4962]: E1201 21:49:26.148337 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist podName:9dc8d3dc-4cdb-45b7-a54f-83db94bdde05 nodeName:}" failed. No retries permitted until 2025-12-01 21:49:27.148318912 +0000 UTC m=+951.249758107 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist") pod "speaker-z5gxh" (UID: "9dc8d3dc-4cdb-45b7-a54f-83db94bdde05") : secret "metallb-memberlist" not found Dec 01 21:49:26 crc kubenswrapper[4962]: I1201 21:49:26.344115 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-hf6jx"] Dec 01 21:49:26 crc kubenswrapper[4962]: W1201 21:49:26.346090 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519181c6_2c70_42ee_825f_427fe5942b07.slice/crio-d1b32a156b45f62599adf0a5571a9640f651671375290b9d2b35c054ecded985 WatchSource:0}: Error finding container d1b32a156b45f62599adf0a5571a9640f651671375290b9d2b35c054ecded985: Status 404 returned error can't find the container with id d1b32a156b45f62599adf0a5571a9640f651671375290b9d2b35c054ecded985 Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.028598 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hf6jx" event={"ID":"519181c6-2c70-42ee-825f-427fe5942b07","Type":"ContainerStarted","Data":"d255985fb62e9320a581a7361dc26a7d852c2cc82a4495ed9b1edbe2b1f9b3c0"} Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.029095 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.029126 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hf6jx" event={"ID":"519181c6-2c70-42ee-825f-427fe5942b07","Type":"ContainerStarted","Data":"de0cb3ad0405504421f47bc7f788712248436460643475ca4a188094b8795a0f"} Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.029139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hf6jx" event={"ID":"519181c6-2c70-42ee-825f-427fe5942b07","Type":"ContainerStarted","Data":"d1b32a156b45f62599adf0a5571a9640f651671375290b9d2b35c054ecded985"} Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.051055 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-hf6jx" podStartSLOduration=2.051037376 podStartE2EDuration="2.051037376s" podCreationTimestamp="2025-12-01 21:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:49:27.045626942 +0000 UTC m=+951.147066147" watchObservedRunningTime="2025-12-01 21:49:27.051037376 +0000 UTC m=+951.152476571" Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.163593 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.184766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9dc8d3dc-4cdb-45b7-a54f-83db94bdde05-memberlist\") pod \"speaker-z5gxh\" (UID: \"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05\") " pod="metallb-system/speaker-z5gxh" Dec 01 21:49:27 crc kubenswrapper[4962]: I1201 21:49:27.405462 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-z5gxh" Dec 01 21:49:27 crc kubenswrapper[4962]: W1201 21:49:27.445871 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc8d3dc_4cdb_45b7_a54f_83db94bdde05.slice/crio-cdd7b68a87ac469b372226fdfc9504fe5b8573e9d0afe2a38f34f35a79dba0de WatchSource:0}: Error finding container cdd7b68a87ac469b372226fdfc9504fe5b8573e9d0afe2a38f34f35a79dba0de: Status 404 returned error can't find the container with id cdd7b68a87ac469b372226fdfc9504fe5b8573e9d0afe2a38f34f35a79dba0de Dec 01 21:49:28 crc kubenswrapper[4962]: I1201 21:49:28.060742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z5gxh" event={"ID":"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05","Type":"ContainerStarted","Data":"7a8541a4d4b3b5471cd2b3fc0f60be185447ccbc402daa2897f6c773a92f728b"} Dec 01 21:49:28 crc kubenswrapper[4962]: I1201 21:49:28.061044 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z5gxh" event={"ID":"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05","Type":"ContainerStarted","Data":"92e9e79b0886555d9caa6a5bb7c4d2a1aeb5c90f6450281661143581944c1ab7"} Dec 01 21:49:28 crc kubenswrapper[4962]: I1201 21:49:28.061057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z5gxh" event={"ID":"9dc8d3dc-4cdb-45b7-a54f-83db94bdde05","Type":"ContainerStarted","Data":"cdd7b68a87ac469b372226fdfc9504fe5b8573e9d0afe2a38f34f35a79dba0de"} Dec 01 21:49:28 crc kubenswrapper[4962]: I1201 21:49:28.061593 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-z5gxh" Dec 01 21:49:28 crc kubenswrapper[4962]: I1201 21:49:28.108263 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-z5gxh" podStartSLOduration=3.108248505 podStartE2EDuration="3.108248505s" podCreationTimestamp="2025-12-01 21:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:49:28.107447193 +0000 UTC m=+952.208886388" watchObservedRunningTime="2025-12-01 21:49:28.108248505 +0000 UTC m=+952.209687700" Dec 01 21:49:29 crc kubenswrapper[4962]: I1201 21:49:29.020197 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:29 crc kubenswrapper[4962]: I1201 21:49:29.020578 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:29 crc kubenswrapper[4962]: I1201 21:49:29.082491 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:29 crc kubenswrapper[4962]: I1201 21:49:29.320857 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:31 crc kubenswrapper[4962]: I1201 21:49:31.466813 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89xg9"] Dec 01 21:49:31 crc kubenswrapper[4962]: I1201 21:49:31.467390 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-89xg9" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="registry-server" containerID="cri-o://fa064ef1ee6b9f30ff0b15e24c1a0226e584c5dae6eeb42ff297f7e49f9b899f" gracePeriod=2 Dec 01 21:49:32 
crc kubenswrapper[4962]: I1201 21:49:32.144517 4962 generic.go:334] "Generic (PLEG): container finished" podID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerID="fa064ef1ee6b9f30ff0b15e24c1a0226e584c5dae6eeb42ff297f7e49f9b899f" exitCode=0 Dec 01 21:49:32 crc kubenswrapper[4962]: I1201 21:49:32.144565 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89xg9" event={"ID":"bac2c3d5-d3cc-44de-b838-ad03aff719d7","Type":"ContainerDied","Data":"fa064ef1ee6b9f30ff0b15e24c1a0226e584c5dae6eeb42ff297f7e49f9b899f"} Dec 01 21:49:36 crc kubenswrapper[4962]: I1201 21:49:36.910041 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.016856 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-catalog-content\") pod \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.017136 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfcm6\" (UniqueName: \"kubernetes.io/projected/bac2c3d5-d3cc-44de-b838-ad03aff719d7-kube-api-access-jfcm6\") pod \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.017165 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-utilities\") pod \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\" (UID: \"bac2c3d5-d3cc-44de-b838-ad03aff719d7\") " Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.018246 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-utilities" (OuterVolumeSpecName: "utilities") pod "bac2c3d5-d3cc-44de-b838-ad03aff719d7" (UID: "bac2c3d5-d3cc-44de-b838-ad03aff719d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.026444 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac2c3d5-d3cc-44de-b838-ad03aff719d7-kube-api-access-jfcm6" (OuterVolumeSpecName: "kube-api-access-jfcm6") pod "bac2c3d5-d3cc-44de-b838-ad03aff719d7" (UID: "bac2c3d5-d3cc-44de-b838-ad03aff719d7"). InnerVolumeSpecName "kube-api-access-jfcm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.041850 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bac2c3d5-d3cc-44de-b838-ad03aff719d7" (UID: "bac2c3d5-d3cc-44de-b838-ad03aff719d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.119419 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfcm6\" (UniqueName: \"kubernetes.io/projected/bac2c3d5-d3cc-44de-b838-ad03aff719d7-kube-api-access-jfcm6\") on node \"crc\" DevicePath \"\"" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.119460 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.119475 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac2c3d5-d3cc-44de-b838-ad03aff719d7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.244028 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89xg9" event={"ID":"bac2c3d5-d3cc-44de-b838-ad03aff719d7","Type":"ContainerDied","Data":"e9272c97fbcc376ce3265554019a85a61e10c44e724e8b6674692a1acddb32c4"} Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.244074 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89xg9" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.244493 4962 scope.go:117] "RemoveContainer" containerID="fa064ef1ee6b9f30ff0b15e24c1a0226e584c5dae6eeb42ff297f7e49f9b899f" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.245684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" event={"ID":"ba7de090-9085-47a3-a086-73f78775d865","Type":"ContainerStarted","Data":"5bf97a38c0fdcf8bfa37e2266efff6461a73a2261681ea288deb869ff0819f9d"} Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.245809 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.252100 4962 generic.go:334] "Generic (PLEG): container finished" podID="7590e32f-b0cb-46dc-a679-46b2ede43ba0" containerID="37d6e698de858f57a199ece7d165b07d271312781b0fdf8ef2ef578b5f46f957" exitCode=0 Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.252177 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerDied","Data":"37d6e698de858f57a199ece7d165b07d271312781b0fdf8ef2ef578b5f46f957"} Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.264434 4962 scope.go:117] "RemoveContainer" containerID="b862cea8a694cf220a503f3b5cd35df781ea26e11e0c1b2ba34360da6a49c1bb" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.267657 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" podStartSLOduration=1.373088217 podStartE2EDuration="12.267641887s" podCreationTimestamp="2025-12-01 21:49:25 +0000 UTC" firstStartedPulling="2025-12-01 21:49:26.008564116 +0000 UTC m=+950.110003311" lastFinishedPulling="2025-12-01 21:49:36.903117786 +0000 UTC m=+961.004556981" observedRunningTime="2025-12-01 21:49:37.264995102 +0000 UTC m=+961.366434297" watchObservedRunningTime="2025-12-01 21:49:37.267641887 +0000 UTC m=+961.369081092" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.315108 4962 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89xg9"] Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.321227 4962 scope.go:117] "RemoveContainer" containerID="82b490220539dd11f3e6e0f98d252a532e43e86f473b4a4523cb1a3a7d689d8d" Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.321953 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-89xg9"] Dec 01 21:49:37 crc kubenswrapper[4962]: I1201 21:49:37.410233 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-z5gxh" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.230144 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" path="/var/lib/kubelet/pods/bac2c3d5-d3cc-44de-b838-ad03aff719d7/volumes" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.262788 4962 generic.go:334] "Generic (PLEG): container finished" podID="7590e32f-b0cb-46dc-a679-46b2ede43ba0" containerID="33368f8cf1f6b4dbbbf88c1180a6ffe01fe70d1f4bd7f33f7c1521b13fde7e93" exitCode=0 Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.263619 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerDied","Data":"33368f8cf1f6b4dbbbf88c1180a6ffe01fe70d1f4bd7f33f7c1521b13fde7e93"} Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.879635 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-84cs2"] Dec 01 21:49:38 crc kubenswrapper[4962]: E1201 21:49:38.880263 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="extract-content" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.880365 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="extract-content" Dec 01 21:49:38 crc kubenswrapper[4962]: E1201 21:49:38.880480 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="extract-utilities" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.880545 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="extract-utilities" Dec 01 21:49:38 crc kubenswrapper[4962]: E1201 21:49:38.880633 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="registry-server" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.880703 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="registry-server" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.880988 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bac2c3d5-d3cc-44de-b838-ad03aff719d7" containerName="registry-server" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.882444 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.899853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84cs2"] Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.956529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4957s\" (UniqueName: \"kubernetes.io/projected/71238ef7-45a7-48e8-9359-f864d95fb095-kube-api-access-4957s\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.956805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-utilities\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:38 crc kubenswrapper[4962]: I1201 21:49:38.956960 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-catalog-content\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.058457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-utilities\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.058565 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-catalog-content\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.058669 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4957s\" (UniqueName: \"kubernetes.io/projected/71238ef7-45a7-48e8-9359-f864d95fb095-kube-api-access-4957s\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.059920 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-utilities\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.060053 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-catalog-content\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.091334 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4957s\" (UniqueName: \"kubernetes.io/projected/71238ef7-45a7-48e8-9359-f864d95fb095-kube-api-access-4957s\") pod \"certified-operators-84cs2\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.209093 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.271170 4962 generic.go:334] "Generic (PLEG): container finished" podID="7590e32f-b0cb-46dc-a679-46b2ede43ba0" containerID="4c2b43000ddaa8664a27ba8ba3db540d1460025a111825a13cb52dece735f39b" exitCode=0 Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.271213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerDied","Data":"4c2b43000ddaa8664a27ba8ba3db540d1460025a111825a13cb52dece735f39b"} Dec 01 21:49:39 crc kubenswrapper[4962]: I1201 21:49:39.809685 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84cs2"] Dec 01 21:49:39 crc kubenswrapper[4962]: W1201 21:49:39.821902 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71238ef7_45a7_48e8_9359_f864d95fb095.slice/crio-32d1b86c669905d190e653f9c72225f61eb9182eb0a2627fbd1f7785de0c3c28 WatchSource:0}: Error finding container 32d1b86c669905d190e653f9c72225f61eb9182eb0a2627fbd1f7785de0c3c28: Status 404 returned error can't find the container with id 32d1b86c669905d190e653f9c72225f61eb9182eb0a2627fbd1f7785de0c3c28 Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.314158 4962 generic.go:334] "Generic (PLEG): container finished" podID="71238ef7-45a7-48e8-9359-f864d95fb095" containerID="65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511" exitCode=0 Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.314318 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84cs2" event={"ID":"71238ef7-45a7-48e8-9359-f864d95fb095","Type":"ContainerDied","Data":"65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511"} Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.314370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84cs2" event={"ID":"71238ef7-45a7-48e8-9359-f864d95fb095","Type":"ContainerStarted","Data":"32d1b86c669905d190e653f9c72225f61eb9182eb0a2627fbd1f7785de0c3c28"} Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.326101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerStarted","Data":"d1f828003f842c080b42523733203d111ff43e98e052504758f4bdc35b8f2254"} Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.326143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerStarted","Data":"21c408036f3540e10db0a3f0198c43f08b8d547985bea1fa4e085ba3d82b22c8"} Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.326159 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" 
event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerStarted","Data":"35ca025baa8f2c0faef5c8853ce462286dbd8fc982f87d2690479a43ada5e152"} Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.326172 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerStarted","Data":"76e99341b39bf65b3412c321f0bd8f8d19fa2699a732d474efb1af594fbad162"} Dec 01 21:49:40 crc kubenswrapper[4962]: I1201 21:49:40.326183 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerStarted","Data":"48e5810f41201c5bfbe0a626662b3f7842795aeaa1f7ca16608b5681ecb65113"} Dec 01 21:49:41 crc kubenswrapper[4962]: I1201 21:49:41.350531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mgx99" event={"ID":"7590e32f-b0cb-46dc-a679-46b2ede43ba0","Type":"ContainerStarted","Data":"ccf9ec25d934bd4b343b127928c3e3c22082e2cba8142368df0a1ffcb5ed5559"} Dec 01 21:49:41 crc kubenswrapper[4962]: I1201 21:49:41.351049 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:41 crc kubenswrapper[4962]: I1201 21:49:41.379119 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mgx99" podStartSLOduration=5.126596441 podStartE2EDuration="16.379087585s" podCreationTimestamp="2025-12-01 21:49:25 +0000 UTC" firstStartedPulling="2025-12-01 21:49:25.669734595 +0000 UTC m=+949.771173780" lastFinishedPulling="2025-12-01 21:49:36.922225729 +0000 UTC m=+961.023664924" observedRunningTime="2025-12-01 21:49:41.37752018 +0000 UTC m=+965.478959385" watchObservedRunningTime="2025-12-01 21:49:41.379087585 +0000 UTC m=+965.480526780" Dec 01 21:49:42 crc kubenswrapper[4962]: I1201 21:49:42.364821 4962 generic.go:334] "Generic (PLEG): container finished" podID="71238ef7-45a7-48e8-9359-f864d95fb095" containerID="c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6" exitCode=0 Dec 01 21:49:42 crc kubenswrapper[4962]: I1201 21:49:42.365017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84cs2" event={"ID":"71238ef7-45a7-48e8-9359-f864d95fb095","Type":"ContainerDied","Data":"c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6"} Dec 01 21:49:43 crc kubenswrapper[4962]: I1201 21:49:43.375836 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84cs2" event={"ID":"71238ef7-45a7-48e8-9359-f864d95fb095","Type":"ContainerStarted","Data":"1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14"} Dec 01 21:49:43 crc kubenswrapper[4962]: I1201 21:49:43.400184 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-84cs2" podStartSLOduration=2.773922447 podStartE2EDuration="5.400161768s" podCreationTimestamp="2025-12-01 21:49:38 +0000 UTC" firstStartedPulling="2025-12-01 21:49:40.318423017 +0000 UTC m=+964.419862212" lastFinishedPulling="2025-12-01 21:49:42.944662318 +0000 UTC m=+967.046101533" observedRunningTime="2025-12-01 21:49:43.399405976 +0000 UTC m=+967.500845171" watchObservedRunningTime="2025-12-01 21:49:43.400161768 +0000 UTC m=+967.501600983" Dec 01 21:49:45 crc kubenswrapper[4962]: I1201 21:49:45.525275 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mgx99" 
Dec 01 21:49:45 crc kubenswrapper[4962]: I1201 21:49:45.588630 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:45 crc kubenswrapper[4962]: I1201 21:49:45.636225 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-hf6jx" Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.684127 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sd67v"] Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.686415 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.689268 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.689362 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.695356 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dnw6q" Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.696232 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sd67v"] Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.739693 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgb8d\" (UniqueName: \"kubernetes.io/projected/d151cbe8-8f07-425d-bd99-c06451f4a3cf-kube-api-access-dgb8d\") pod \"openstack-operator-index-sd67v\" (UID: \"d151cbe8-8f07-425d-bd99-c06451f4a3cf\") " pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.841419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgb8d\" (UniqueName: \"kubernetes.io/projected/d151cbe8-8f07-425d-bd99-c06451f4a3cf-kube-api-access-dgb8d\") pod \"openstack-operator-index-sd67v\" (UID: \"d151cbe8-8f07-425d-bd99-c06451f4a3cf\") " pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:47 crc kubenswrapper[4962]: I1201 21:49:47.866333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgb8d\" (UniqueName: \"kubernetes.io/projected/d151cbe8-8f07-425d-bd99-c06451f4a3cf-kube-api-access-dgb8d\") pod \"openstack-operator-index-sd67v\" (UID: \"d151cbe8-8f07-425d-bd99-c06451f4a3cf\") " pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:48 crc kubenswrapper[4962]: I1201 21:49:48.019646 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:48 crc kubenswrapper[4962]: I1201 21:49:48.593107 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sd67v"] Dec 01 21:49:48 crc kubenswrapper[4962]: W1201 21:49:48.597530 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd151cbe8_8f07_425d_bd99_c06451f4a3cf.slice/crio-f0d977b828e3219972390e370dea099af374577b0636f0c8cf101d2a601e37d8 WatchSource:0}: Error finding container f0d977b828e3219972390e370dea099af374577b0636f0c8cf101d2a601e37d8: Status 404 returned error can't find the container with id f0d977b828e3219972390e370dea099af374577b0636f0c8cf101d2a601e37d8 Dec 01 21:49:49 crc kubenswrapper[4962]: I1201 21:49:49.210213 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:49 crc kubenswrapper[4962]: I1201 21:49:49.210477 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:49 crc kubenswrapper[4962]: I1201 21:49:49.264311 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:49 crc kubenswrapper[4962]: I1201 21:49:49.455692 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sd67v" event={"ID":"d151cbe8-8f07-425d-bd99-c06451f4a3cf","Type":"ContainerStarted","Data":"f0d977b828e3219972390e370dea099af374577b0636f0c8cf101d2a601e37d8"} Dec 01 21:49:49 crc kubenswrapper[4962]: I1201 21:49:49.519028 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:54 crc kubenswrapper[4962]: I1201 21:49:54.499825 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sd67v" event={"ID":"d151cbe8-8f07-425d-bd99-c06451f4a3cf","Type":"ContainerStarted","Data":"b674f97d47174cbdb3ca65f296260e7df29416fbccf8bbbcfd6beb0322b9b1be"} Dec 01 21:49:54 crc kubenswrapper[4962]: I1201 21:49:54.518762 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-sd67v" podStartSLOduration=2.612382043 podStartE2EDuration="7.518736611s" podCreationTimestamp="2025-12-01 21:49:47 +0000 UTC" firstStartedPulling="2025-12-01 21:49:48.601017222 +0000 UTC m=+972.702456427" lastFinishedPulling="2025-12-01 21:49:53.5073718 +0000 UTC m=+977.608810995" observedRunningTime="2025-12-01 21:49:54.517038913 +0000 UTC m=+978.618478118" watchObservedRunningTime="2025-12-01 21:49:54.518736611 +0000 UTC m=+978.620175846" Dec 01 21:49:55 crc kubenswrapper[4962]: I1201 21:49:55.530457 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mgx99" Dec 01 21:49:55 crc kubenswrapper[4962]: I1201 21:49:55.539264 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-2xwv6" Dec 01 21:49:55 crc kubenswrapper[4962]: I1201 21:49:55.871816 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84cs2"] Dec 01 21:49:55 crc kubenswrapper[4962]: I1201 21:49:55.872176 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-84cs2" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="registry-server" containerID="cri-o://1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14" gracePeriod=2 Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.452478 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.527652 4962 generic.go:334] "Generic (PLEG): container finished" podID="71238ef7-45a7-48e8-9359-f864d95fb095" containerID="1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14" exitCode=0 Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.527695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84cs2" event={"ID":"71238ef7-45a7-48e8-9359-f864d95fb095","Type":"ContainerDied","Data":"1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14"} Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.527722 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84cs2" event={"ID":"71238ef7-45a7-48e8-9359-f864d95fb095","Type":"ContainerDied","Data":"32d1b86c669905d190e653f9c72225f61eb9182eb0a2627fbd1f7785de0c3c28"} Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.527722 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84cs2" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.527737 4962 scope.go:117] "RemoveContainer" containerID="1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.534699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-utilities\") pod \"71238ef7-45a7-48e8-9359-f864d95fb095\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.534853 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-catalog-content\") pod \"71238ef7-45a7-48e8-9359-f864d95fb095\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.534921 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4957s\" (UniqueName: \"kubernetes.io/projected/71238ef7-45a7-48e8-9359-f864d95fb095-kube-api-access-4957s\") pod \"71238ef7-45a7-48e8-9359-f864d95fb095\" (UID: \"71238ef7-45a7-48e8-9359-f864d95fb095\") " Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.536409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-utilities" (OuterVolumeSpecName: "utilities") pod "71238ef7-45a7-48e8-9359-f864d95fb095" (UID: "71238ef7-45a7-48e8-9359-f864d95fb095"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.542522 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71238ef7-45a7-48e8-9359-f864d95fb095-kube-api-access-4957s" (OuterVolumeSpecName: "kube-api-access-4957s") pod "71238ef7-45a7-48e8-9359-f864d95fb095" (UID: "71238ef7-45a7-48e8-9359-f864d95fb095"). InnerVolumeSpecName "kube-api-access-4957s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.550620 4962 scope.go:117] "RemoveContainer" containerID="c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.603850 4962 scope.go:117] "RemoveContainer" containerID="65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.616728 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71238ef7-45a7-48e8-9359-f864d95fb095" (UID: "71238ef7-45a7-48e8-9359-f864d95fb095"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.637244 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.637287 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4957s\" (UniqueName: \"kubernetes.io/projected/71238ef7-45a7-48e8-9359-f864d95fb095-kube-api-access-4957s\") on node \"crc\" DevicePath \"\"" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.637302 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71238ef7-45a7-48e8-9359-f864d95fb095-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.643819 4962 scope.go:117] "RemoveContainer" containerID="1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14" Dec 01 21:49:57 crc kubenswrapper[4962]: E1201 21:49:57.644427 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14\": container with ID starting with 1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14 not found: ID does not exist" containerID="1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.644500 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14"} err="failed to get container status \"1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14\": rpc error: code = NotFound desc = could not find container \"1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14\": container with ID starting with 1ccf6e7222640086f0b09cfc24756262e8ceb894e02610016dac3ec96c2d5e14 not found: ID does not exist" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.644559 4962 scope.go:117] "RemoveContainer" containerID="c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6" Dec 01 
21:49:57 crc kubenswrapper[4962]: E1201 21:49:57.645091 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6\": container with ID starting with c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6 not found: ID does not exist" containerID="c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.645131 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6"} err="failed to get container status \"c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6\": rpc error: code = NotFound desc = could not find container \"c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6\": container with ID starting with c297cc507f420764b85aaffe2581f963a88cb40da838bbf2bbca54da593301a6 not found: ID does not exist" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.645158 4962 scope.go:117] "RemoveContainer" containerID="65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511" Dec 01 21:49:57 crc kubenswrapper[4962]: E1201 21:49:57.646022 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511\": container with ID starting with 65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511 not found: ID does not exist" containerID="65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.646054 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511"} err="failed to get container status \"65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511\": rpc error: code = NotFound desc = could not find container \"65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511\": container with ID starting with 65e9a7a53a62ee168f8ddf2df62d737cd2a1c5cc92c6877c8c15580e37fd3511 not found: ID does not exist" Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.871695 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84cs2"] Dec 01 21:49:57 crc kubenswrapper[4962]: I1201 21:49:57.881430 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-84cs2"] Dec 01 21:49:58 crc kubenswrapper[4962]: I1201 21:49:58.020780 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:58 crc kubenswrapper[4962]: I1201 21:49:58.021127 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:58 crc kubenswrapper[4962]: I1201 21:49:58.069586 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:49:58 crc kubenswrapper[4962]: I1201 21:49:58.228509 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" path="/var/lib/kubelet/pods/71238ef7-45a7-48e8-9359-f864d95fb095/volumes" Dec 01 21:49:58 crc kubenswrapper[4962]: I1201 21:49:58.563330 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-sd67v" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.733567 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g"] Dec 01 21:50:00 crc kubenswrapper[4962]: E1201 21:50:00.734554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="extract-utilities" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.734587 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="extract-utilities" Dec 01 21:50:00 crc kubenswrapper[4962]: E1201 21:50:00.734657 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="registry-server" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.734676 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="registry-server" Dec 01 21:50:00 crc kubenswrapper[4962]: E1201 21:50:00.734697 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="extract-content" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.734713 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="extract-content" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.735145 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="71238ef7-45a7-48e8-9359-f864d95fb095" containerName="registry-server" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.741096 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.744166 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g"] Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.745362 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4zpdw" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.893878 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdd8g\" (UniqueName: \"kubernetes.io/projected/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-kube-api-access-cdd8g\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.894023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-bundle\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.894096 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-util\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.995893 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-util\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.996000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdd8g\" (UniqueName: \"kubernetes.io/projected/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-kube-api-access-cdd8g\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.996093 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-bundle\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.996677 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-bundle\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:00 crc kubenswrapper[4962]: I1201 21:50:00.996707 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-util\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:01 crc kubenswrapper[4962]: I1201 21:50:01.019452 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdd8g\" (UniqueName: \"kubernetes.io/projected/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-kube-api-access-cdd8g\") pod \"73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:01 crc kubenswrapper[4962]: I1201 21:50:01.066120 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:01 crc kubenswrapper[4962]: I1201 21:50:01.558713 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g"] Dec 01 21:50:02 crc kubenswrapper[4962]: I1201 21:50:02.567226 4962 generic.go:334] "Generic (PLEG): container finished" podID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerID="58ed978ac91832685389693df0d3436af8a0210a6ad491df59703e660aa96945" exitCode=0 Dec 01 21:50:02 crc kubenswrapper[4962]: I1201 21:50:02.567274 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" event={"ID":"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3","Type":"ContainerDied","Data":"58ed978ac91832685389693df0d3436af8a0210a6ad491df59703e660aa96945"} Dec 01 21:50:02 crc kubenswrapper[4962]: I1201 21:50:02.567299 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" event={"ID":"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3","Type":"ContainerStarted","Data":"9b4660c432bbb373abdd90b86e4526030ef0a101c67831710e5bd02e2d8d241b"} Dec 01 21:50:02 crc kubenswrapper[4962]: I1201 21:50:02.784456 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:50:02 crc kubenswrapper[4962]: I1201 21:50:02.784513 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:50:03 crc kubenswrapper[4962]: I1201 21:50:03.606490 4962 generic.go:334] "Generic (PLEG): container finished" podID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" 
containerID="9da833ba3568dff63d63fbfb667298c116bf5c05a4b0dc9ad1395a7e0632ea46" exitCode=0 Dec 01 21:50:03 crc kubenswrapper[4962]: I1201 21:50:03.606810 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" event={"ID":"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3","Type":"ContainerDied","Data":"9da833ba3568dff63d63fbfb667298c116bf5c05a4b0dc9ad1395a7e0632ea46"} Dec 01 21:50:04 crc kubenswrapper[4962]: I1201 21:50:04.619804 4962 generic.go:334] "Generic (PLEG): container finished" podID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerID="e20f3e0f66029cf9ad28a94e64480d3dc8dd108ffeb9ca49c99bc53b9de83b3e" exitCode=0 Dec 01 21:50:04 crc kubenswrapper[4962]: I1201 21:50:04.619885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" event={"ID":"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3","Type":"ContainerDied","Data":"e20f3e0f66029cf9ad28a94e64480d3dc8dd108ffeb9ca49c99bc53b9de83b3e"} Dec 01 21:50:05 crc kubenswrapper[4962]: I1201 21:50:05.954377 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.106331 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-util\") pod \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.106619 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-bundle\") pod \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.106728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdd8g\" (UniqueName: \"kubernetes.io/projected/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-kube-api-access-cdd8g\") pod \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\" (UID: \"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3\") " Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.107962 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-bundle" (OuterVolumeSpecName: "bundle") pod "dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" (UID: "dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.117638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-kube-api-access-cdd8g" (OuterVolumeSpecName: "kube-api-access-cdd8g") pod "dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" (UID: "dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3"). InnerVolumeSpecName "kube-api-access-cdd8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.120637 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-util" (OuterVolumeSpecName: "util") pod "dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" (UID: "dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.208863 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdd8g\" (UniqueName: \"kubernetes.io/projected/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-kube-api-access-cdd8g\") on node \"crc\" DevicePath \"\"" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.208903 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-util\") on node \"crc\" DevicePath \"\"" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.208920 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.644275 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" event={"ID":"dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3","Type":"ContainerDied","Data":"9b4660c432bbb373abdd90b86e4526030ef0a101c67831710e5bd02e2d8d241b"} Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.644336 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b4660c432bbb373abdd90b86e4526030ef0a101c67831710e5bd02e2d8d241b" Dec 01 21:50:06 crc kubenswrapper[4962]: I1201 21:50:06.644340 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.591971 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87"] Dec 01 21:50:13 crc kubenswrapper[4962]: E1201 21:50:13.593039 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerName="pull" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.593060 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerName="pull" Dec 01 21:50:13 crc kubenswrapper[4962]: E1201 21:50:13.593094 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerName="extract" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.593107 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerName="extract" Dec 01 21:50:13 crc kubenswrapper[4962]: E1201 21:50:13.593136 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerName="util" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.593148 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerName="util" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.593425 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3" containerName="extract" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.594345 4962 util.go:30] "No sandbox for pod can be found. 
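
[Editor's note] The E-level "RemoveStaleState: removing container" entries here (and in the similar block at 21:50:00 above) are routine bookkeeping, not failures: when a new pod is admitted, the CPU and memory managers walk their checkpointed per-container assignments and drop the ones belonging to pods that no longer exist, such as the finished bundle pod dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3 (containers pull, extract, util). A rough sketch of the idea, with illustrative types and names only:

package main

import "fmt"

// key identifies one checkpointed per-container assignment (a hypothetical
// shape; kubelet's real state lives in the cpu_manager/memory_manager
// checkpoint files).
type key struct{ podUID, container string }

// removeStaleState drops assignments whose pod is no longer active.
func removeStaleState(assignments map[key][]int, activePods map[string]bool) {
	for k := range assignments { // deleting during range is safe in Go
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %s/%s\n", k.podUID, k.container)
			delete(assignments, k)
		}
	}
}

func main() {
	assignments := map[key][]int{
		{podUID: "dfd7b1bd", container: "pull"}:    {0},
		{podUID: "dfd7b1bd", container: "extract"}: {1},
		{podUID: "dfd7b1bd", container: "util"}:    {2},
	}
	removeStaleState(assignments, map[string]bool{"555a34ee": true})
	fmt.Println("assignments left:", len(assignments)) // 0
}
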
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.599473 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-697sb" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.683151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4bq\" (UniqueName: \"kubernetes.io/projected/555a34ee-8a52-4159-8e01-ed6dcceb45e9-kube-api-access-bl4bq\") pod \"openstack-operator-controller-operator-5dc9c6958f-52l87\" (UID: \"555a34ee-8a52-4159-8e01-ed6dcceb45e9\") " pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.718182 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87"] Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.784889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4bq\" (UniqueName: \"kubernetes.io/projected/555a34ee-8a52-4159-8e01-ed6dcceb45e9-kube-api-access-bl4bq\") pod \"openstack-operator-controller-operator-5dc9c6958f-52l87\" (UID: \"555a34ee-8a52-4159-8e01-ed6dcceb45e9\") " pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.810270 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4bq\" (UniqueName: \"kubernetes.io/projected/555a34ee-8a52-4159-8e01-ed6dcceb45e9-kube-api-access-bl4bq\") pod \"openstack-operator-controller-operator-5dc9c6958f-52l87\" (UID: \"555a34ee-8a52-4159-8e01-ed6dcceb45e9\") " pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" Dec 01 21:50:13 crc kubenswrapper[4962]: I1201 21:50:13.922875 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" Dec 01 21:50:14 crc kubenswrapper[4962]: I1201 21:50:14.376167 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87"] Dec 01 21:50:14 crc kubenswrapper[4962]: W1201 21:50:14.390357 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod555a34ee_8a52_4159_8e01_ed6dcceb45e9.slice/crio-4f14ead2a01f826fad1f100ed516450ef3b5999682409d7eadcdc735c530f1d6 WatchSource:0}: Error finding container 4f14ead2a01f826fad1f100ed516450ef3b5999682409d7eadcdc735c530f1d6: Status 404 returned error can't find the container with id 4f14ead2a01f826fad1f100ed516450ef3b5999682409d7eadcdc735c530f1d6 Dec 01 21:50:14 crc kubenswrapper[4962]: I1201 21:50:14.713852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" event={"ID":"555a34ee-8a52-4159-8e01-ed6dcceb45e9","Type":"ContainerStarted","Data":"4f14ead2a01f826fad1f100ed516450ef3b5999682409d7eadcdc735c530f1d6"} Dec 01 21:50:19 crc kubenswrapper[4962]: I1201 21:50:19.756664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" event={"ID":"555a34ee-8a52-4159-8e01-ed6dcceb45e9","Type":"ContainerStarted","Data":"f36f094348c6362aefc08284faa24fbf8aedf8f8f09562fef8cffaffbe56cc4b"} Dec 01 21:50:19 crc kubenswrapper[4962]: I1201 21:50:19.757190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" Dec 01 21:50:19 crc kubenswrapper[4962]: I1201 21:50:19.788365 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" podStartSLOduration=1.6655991110000001 podStartE2EDuration="6.788352773s" podCreationTimestamp="2025-12-01 21:50:13 +0000 UTC" firstStartedPulling="2025-12-01 21:50:14.394466242 +0000 UTC m=+998.495905447" lastFinishedPulling="2025-12-01 21:50:19.517219914 +0000 UTC m=+1003.618659109" observedRunningTime="2025-12-01 21:50:19.785244605 +0000 UTC m=+1003.886683800" watchObservedRunningTime="2025-12-01 21:50:19.788352773 +0000 UTC m=+1003.889791968" Dec 01 21:50:32 crc kubenswrapper[4962]: I1201 21:50:32.784828 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:50:32 crc kubenswrapper[4962]: I1201 21:50:32.785833 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:50:33 crc kubenswrapper[4962]: I1201 21:50:33.927365 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5dc9c6958f-52l87" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.048636 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-jtgdl"] Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.052243 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.074962 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtgdl"] Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.189406 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-utilities\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.189838 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-catalog-content\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.189908 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lclt8\" (UniqueName: \"kubernetes.io/projected/de696fc6-3587-4bf4-82bb-bbac058b18dc-kube-api-access-lclt8\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.292169 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-utilities\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.292320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-catalog-content\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.292364 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclt8\" (UniqueName: \"kubernetes.io/projected/de696fc6-3587-4bf4-82bb-bbac058b18dc-kube-api-access-lclt8\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.293072 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-utilities\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.293234 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-catalog-content\") pod \"community-operators-jtgdl\" (UID: 
\"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.318783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclt8\" (UniqueName: \"kubernetes.io/projected/de696fc6-3587-4bf4-82bb-bbac058b18dc-kube-api-access-lclt8\") pod \"community-operators-jtgdl\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") " pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:45 crc kubenswrapper[4962]: I1201 21:50:45.386051 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:46 crc kubenswrapper[4962]: I1201 21:50:46.021566 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtgdl"] Dec 01 21:50:47 crc kubenswrapper[4962]: I1201 21:50:47.032005 4962 generic.go:334] "Generic (PLEG): container finished" podID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerID="897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8" exitCode=0 Dec 01 21:50:47 crc kubenswrapper[4962]: I1201 21:50:47.032318 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtgdl" event={"ID":"de696fc6-3587-4bf4-82bb-bbac058b18dc","Type":"ContainerDied","Data":"897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8"} Dec 01 21:50:47 crc kubenswrapper[4962]: I1201 21:50:47.032348 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtgdl" event={"ID":"de696fc6-3587-4bf4-82bb-bbac058b18dc","Type":"ContainerStarted","Data":"e8f0ea8c0ea7651a38ba48efaab430d8af6db81a19f7e8bb3399e4b217113450"} Dec 01 21:50:49 crc kubenswrapper[4962]: I1201 21:50:49.052823 4962 generic.go:334] "Generic (PLEG): container finished" podID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerID="cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8" exitCode=0 Dec 01 21:50:49 crc kubenswrapper[4962]: I1201 21:50:49.053101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtgdl" event={"ID":"de696fc6-3587-4bf4-82bb-bbac058b18dc","Type":"ContainerDied","Data":"cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8"} Dec 01 21:50:50 crc kubenswrapper[4962]: I1201 21:50:50.061605 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtgdl" event={"ID":"de696fc6-3587-4bf4-82bb-bbac058b18dc","Type":"ContainerStarted","Data":"839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012"} Dec 01 21:50:50 crc kubenswrapper[4962]: I1201 21:50:50.104197 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jtgdl" podStartSLOduration=2.6398659220000003 podStartE2EDuration="5.104181357s" podCreationTimestamp="2025-12-01 21:50:45 +0000 UTC" firstStartedPulling="2025-12-01 21:50:47.034486794 +0000 UTC m=+1031.135926029" lastFinishedPulling="2025-12-01 21:50:49.498802269 +0000 UTC m=+1033.600241464" observedRunningTime="2025-12-01 21:50:50.098406543 +0000 UTC m=+1034.199845778" watchObservedRunningTime="2025-12-01 21:50:50.104181357 +0000 UTC m=+1034.205620552" Dec 01 21:50:55 crc kubenswrapper[4962]: I1201 21:50:55.387074 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:55 crc kubenswrapper[4962]: I1201 
21:50:55.388081 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jtgdl"
Dec 01 21:50:55 crc kubenswrapper[4962]: I1201 21:50:55.595869 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jtgdl"
Dec 01 21:50:56 crc kubenswrapper[4962]: I1201 21:50:56.255890 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jtgdl"
Dec 01 21:50:56 crc kubenswrapper[4962]: I1201 21:50:56.345741 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtgdl"]
Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.118858 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jtgdl" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="registry-server" containerID="cri-o://839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012" gracePeriod=2
Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.583375 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtgdl"
Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.768965 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-utilities\") pod \"de696fc6-3587-4bf4-82bb-bbac058b18dc\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") "
Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.769095 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-catalog-content\") pod \"de696fc6-3587-4bf4-82bb-bbac058b18dc\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") "
Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.769122 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lclt8\" (UniqueName: \"kubernetes.io/projected/de696fc6-3587-4bf4-82bb-bbac058b18dc-kube-api-access-lclt8\") pod \"de696fc6-3587-4bf4-82bb-bbac058b18dc\" (UID: \"de696fc6-3587-4bf4-82bb-bbac058b18dc\") "
Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.770500 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-utilities" (OuterVolumeSpecName: "utilities") pod "de696fc6-3587-4bf4-82bb-bbac058b18dc" (UID: "de696fc6-3587-4bf4-82bb-bbac058b18dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
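
[Editor's note] The "Killing container with a grace period" entry above (gracePeriod=2) is the normal termination path for the deleted catalog pod: the runtime delivers SIGTERM to registry-server and escalates to SIGKILL only if the process outlives the grace period; the exitCode=0 "container finished" event below shows it went down cleanly. A local-process sketch of the term-then-kill shape (illustrative; the real stop is performed by CRI-O, and stopWithGrace is a hypothetical helper):

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace signals a process, waits up to the grace period, then
// force-kills it, mirroring the runtime's stop sequence.
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	_ = cmd.Process.Signal(syscall.SIGTERM) // polite request first
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill() // SIGKILL once the grace period lapses
		<-done
		fmt.Println("force-killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "30")
	if err := cmd.Start(); err != nil {
		fmt.Println("start:", err)
		return
	}
	stopWithGrace(cmd, 2*time.Second)
}
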
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.819167 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de696fc6-3587-4bf4-82bb-bbac058b18dc" (UID: "de696fc6-3587-4bf4-82bb-bbac058b18dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.871195 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.871228 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de696fc6-3587-4bf4-82bb-bbac058b18dc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:50:58 crc kubenswrapper[4962]: I1201 21:50:58.871241 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lclt8\" (UniqueName: \"kubernetes.io/projected/de696fc6-3587-4bf4-82bb-bbac058b18dc-kube-api-access-lclt8\") on node \"crc\" DevicePath \"\"" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.149027 4962 generic.go:334] "Generic (PLEG): container finished" podID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerID="839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012" exitCode=0 Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.149073 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtgdl" event={"ID":"de696fc6-3587-4bf4-82bb-bbac058b18dc","Type":"ContainerDied","Data":"839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012"} Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.149106 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtgdl" event={"ID":"de696fc6-3587-4bf4-82bb-bbac058b18dc","Type":"ContainerDied","Data":"e8f0ea8c0ea7651a38ba48efaab430d8af6db81a19f7e8bb3399e4b217113450"} Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.149122 4962 scope.go:117] "RemoveContainer" containerID="839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.149302 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jtgdl" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.187983 4962 scope.go:117] "RemoveContainer" containerID="cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.205369 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtgdl"] Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.209426 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jtgdl"] Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.225297 4962 scope.go:117] "RemoveContainer" containerID="897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.242982 4962 scope.go:117] "RemoveContainer" containerID="839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012" Dec 01 21:50:59 crc kubenswrapper[4962]: E1201 21:50:59.243440 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012\": container with ID starting with 839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012 not found: ID does not exist" containerID="839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.243486 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012"} err="failed to get container status \"839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012\": rpc error: code = NotFound desc = could not find container \"839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012\": container with ID starting with 839364b5a9145aef68f3a15d0a17a92fd96fb4101218e14356d36f734b341012 not found: ID does not exist" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.243514 4962 scope.go:117] "RemoveContainer" containerID="cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8" Dec 01 21:50:59 crc kubenswrapper[4962]: E1201 21:50:59.243904 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8\": container with ID starting with cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8 not found: ID does not exist" containerID="cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.243962 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8"} err="failed to get container status \"cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8\": rpc error: code = NotFound desc = could not find container \"cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8\": container with ID starting with cd51c0aee06212f04194acaa3138703460b467be2dbad72c6f8b8d01399ba0c8 not found: ID does not exist" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.243982 4962 scope.go:117] "RemoveContainer" containerID="897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8" Dec 01 21:50:59 crc kubenswrapper[4962]: E1201 21:50:59.244198 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8\": container with ID starting with 897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8 not found: ID does not exist" containerID="897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8" Dec 01 21:50:59 crc kubenswrapper[4962]: I1201 21:50:59.244222 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8"} err="failed to get container status \"897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8\": rpc error: code = NotFound desc = could not find container \"897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8\": container with ID starting with 897970a82a1d797ae4e6de2f6504b6624c3d57ee501c6f718c4ec9004ee67db8 not found: ID does not exist" Dec 01 21:51:00 crc kubenswrapper[4962]: I1201 21:51:00.235226 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" path="/var/lib/kubelet/pods/de696fc6-3587-4bf4-82bb-bbac058b18dc/volumes" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.828540 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d"] Dec 01 21:51:01 crc kubenswrapper[4962]: E1201 21:51:01.829038 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="extract-content" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.829051 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="extract-content" Dec 01 21:51:01 crc kubenswrapper[4962]: E1201 21:51:01.829060 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="extract-utilities" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.829066 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="extract-utilities" Dec 01 21:51:01 crc kubenswrapper[4962]: E1201 21:51:01.829082 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="registry-server" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.829090 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="registry-server" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.829224 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="de696fc6-3587-4bf4-82bb-bbac058b18dc" containerName="registry-server" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.829918 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.832278 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6wrlv" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.840956 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp"] Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.842367 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.844534 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-br5pc" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.848166 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d"] Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.854819 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp"] Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.874808 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg"] Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.876152 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.878231 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gdgvn" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.885557 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97"] Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.887034 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.905223 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5kpxd" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.930734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thl78\" (UniqueName: \"kubernetes.io/projected/8e655cd6-3169-46a0-b299-37d13dae8d3a-kube-api-access-thl78\") pod \"designate-operator-controller-manager-78b4bc895b-sqwvg\" (UID: \"8e655cd6-3169-46a0-b299-37d13dae8d3a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.933059 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvjnl\" (UniqueName: \"kubernetes.io/projected/0e2461fa-57b4-406a-9801-522b2e3ee2f0-kube-api-access-bvjnl\") pod \"barbican-operator-controller-manager-7d9dfd778-t725d\" (UID: \"0e2461fa-57b4-406a-9801-522b2e3ee2f0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.933528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj96d\" (UniqueName: \"kubernetes.io/projected/39217f35-ba4e-402b-84fe-876ca232ff60-kube-api-access-cj96d\") pod \"cinder-operator-controller-manager-859b6ccc6-2nvkp\" (UID: \"39217f35-ba4e-402b-84fe-876ca232ff60\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.933614 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4wwqn\" (UniqueName: \"kubernetes.io/projected/d871da7f-4b47-4931-aa3b-1525f50b2bde-kube-api-access-4wwqn\") pod \"glance-operator-controller-manager-668d9c48b9-8xc97\" (UID: \"d871da7f-4b47-4931-aa3b-1525f50b2bde\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.946020 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg"] Dec 01 21:51:01 crc kubenswrapper[4962]: I1201 21:51:01.985658 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.021285 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.022643 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.024872 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-fr86x" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.036035 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.040127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thl78\" (UniqueName: \"kubernetes.io/projected/8e655cd6-3169-46a0-b299-37d13dae8d3a-kube-api-access-thl78\") pod \"designate-operator-controller-manager-78b4bc895b-sqwvg\" (UID: \"8e655cd6-3169-46a0-b299-37d13dae8d3a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.040185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvjnl\" (UniqueName: \"kubernetes.io/projected/0e2461fa-57b4-406a-9801-522b2e3ee2f0-kube-api-access-bvjnl\") pod \"barbican-operator-controller-manager-7d9dfd778-t725d\" (UID: \"0e2461fa-57b4-406a-9801-522b2e3ee2f0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.040221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9mss\" (UniqueName: \"kubernetes.io/projected/1a9bd198-45fa-40ba-b3a0-55c150c211d6-kube-api-access-q9mss\") pod \"heat-operator-controller-manager-5f64f6f8bb-sjzl9\" (UID: \"1a9bd198-45fa-40ba-b3a0-55c150c211d6\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.040270 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj96d\" (UniqueName: \"kubernetes.io/projected/39217f35-ba4e-402b-84fe-876ca232ff60-kube-api-access-cj96d\") pod \"cinder-operator-controller-manager-859b6ccc6-2nvkp\" (UID: \"39217f35-ba4e-402b-84fe-876ca232ff60\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.040296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wwqn\" 
(UniqueName: \"kubernetes.io/projected/d871da7f-4b47-4931-aa3b-1525f50b2bde-kube-api-access-4wwqn\") pod \"glance-operator-controller-manager-668d9c48b9-8xc97\" (UID: \"d871da7f-4b47-4931-aa3b-1525f50b2bde\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.060006 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.061553 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.065673 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gjp59" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.081741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjnl\" (UniqueName: \"kubernetes.io/projected/0e2461fa-57b4-406a-9801-522b2e3ee2f0-kube-api-access-bvjnl\") pod \"barbican-operator-controller-manager-7d9dfd778-t725d\" (UID: \"0e2461fa-57b4-406a-9801-522b2e3ee2f0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.083567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wwqn\" (UniqueName: \"kubernetes.io/projected/d871da7f-4b47-4931-aa3b-1525f50b2bde-kube-api-access-4wwqn\") pod \"glance-operator-controller-manager-668d9c48b9-8xc97\" (UID: \"d871da7f-4b47-4931-aa3b-1525f50b2bde\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.097323 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-27r4m"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.098649 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.099233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thl78\" (UniqueName: \"kubernetes.io/projected/8e655cd6-3169-46a0-b299-37d13dae8d3a-kube-api-access-thl78\") pod \"designate-operator-controller-manager-78b4bc895b-sqwvg\" (UID: \"8e655cd6-3169-46a0-b299-37d13dae8d3a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.111347 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.111421 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r4pv7" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.113292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj96d\" (UniqueName: \"kubernetes.io/projected/39217f35-ba4e-402b-84fe-876ca232ff60-kube-api-access-cj96d\") pod \"cinder-operator-controller-manager-859b6ccc6-2nvkp\" (UID: \"39217f35-ba4e-402b-84fe-876ca232ff60\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.124274 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.129383 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-27r4m"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.142528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mck9m\" (UniqueName: \"kubernetes.io/projected/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-kube-api-access-mck9m\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.142646 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.142677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9mss\" (UniqueName: \"kubernetes.io/projected/1a9bd198-45fa-40ba-b3a0-55c150c211d6-kube-api-access-q9mss\") pod \"heat-operator-controller-manager-5f64f6f8bb-sjzl9\" (UID: \"1a9bd198-45fa-40ba-b3a0-55c150c211d6\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.142711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppdlm\" (UniqueName: \"kubernetes.io/projected/fb3ad1a2-8ee0-4d12-8499-d10819081f1b-kube-api-access-ppdlm\") pod \"horizon-operator-controller-manager-68c6d99b8f-mllgh\" (UID: 
\"fb3ad1a2-8ee0-4d12-8499-d10819081f1b\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.146112 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.146723 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.148266 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.154788 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.156700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-2v4gt" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.158009 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.162542 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.163983 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.171508 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-b4tmz" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.187584 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.191851 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.202678 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9mss\" (UniqueName: \"kubernetes.io/projected/1a9bd198-45fa-40ba-b3a0-55c150c211d6-kube-api-access-q9mss\") pod \"heat-operator-controller-manager-5f64f6f8bb-sjzl9\" (UID: \"1a9bd198-45fa-40ba-b3a0-55c150c211d6\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.203089 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8p4gk" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.212560 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.221674 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.222903 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.227343 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rbn8z" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.237675 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.257283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2smln\" (UniqueName: \"kubernetes.io/projected/3673ec86-6e36-4f0b-ac14-87e5d89e283e-kube-api-access-2smln\") pod \"manila-operator-controller-manager-6546668bfd-zvp89\" (UID: \"3673ec86-6e36-4f0b-ac14-87e5d89e283e\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.259653 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.259821 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt9pm\" (UniqueName: \"kubernetes.io/projected/fb72edda-e449-44f6-a85d-b74c0f3f9ad2-kube-api-access-vt9pm\") pod \"mariadb-operator-controller-manager-56bbcc9d85-nptld\" (UID: \"fb72edda-e449-44f6-a85d-b74c0f3f9ad2\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" Dec 01 21:51:02 crc kubenswrapper[4962]: E1201 21:51:02.260466 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.262681 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.263589 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppdlm\" (UniqueName: \"kubernetes.io/projected/fb3ad1a2-8ee0-4d12-8499-d10819081f1b-kube-api-access-ppdlm\") pod \"horizon-operator-controller-manager-68c6d99b8f-mllgh\" (UID: \"fb3ad1a2-8ee0-4d12-8499-d10819081f1b\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.263673 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h978r\" (UniqueName: \"kubernetes.io/projected/ed78cdfd-dc4e-4528-9542-6fc778f54e5f-kube-api-access-h978r\") pod \"keystone-operator-controller-manager-546d4bdf48-fkvsq\" (UID: 
\"ed78cdfd-dc4e-4528-9542-6fc778f54e5f\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.263805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mck9m\" (UniqueName: \"kubernetes.io/projected/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-kube-api-access-mck9m\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.263907 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtndt\" (UniqueName: \"kubernetes.io/projected/d0ae9966-90f0-4d97-a056-dd9e86c81949-kube-api-access-dtndt\") pod \"ironic-operator-controller-manager-6c548fd776-w68ng\" (UID: \"d0ae9966-90f0-4d97-a056-dd9e86c81949\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" Dec 01 21:51:02 crc kubenswrapper[4962]: E1201 21:51:02.264174 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert podName:2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:02.764154904 +0000 UTC m=+1046.865594099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert") pod "infra-operator-controller-manager-57548d458d-27r4m" (UID: "2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8") : secret "infra-operator-webhook-server-cert" not found Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.275165 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.291490 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.293452 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.294335 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppdlm\" (UniqueName: \"kubernetes.io/projected/fb3ad1a2-8ee0-4d12-8499-d10819081f1b-kube-api-access-ppdlm\") pod \"horizon-operator-controller-manager-68c6d99b8f-mllgh\" (UID: \"fb3ad1a2-8ee0-4d12-8499-d10819081f1b\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.297346 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6c5gb" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.299121 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mck9m\" (UniqueName: \"kubernetes.io/projected/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-kube-api-access-mck9m\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.303336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.328003 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.348001 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.349416 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.353494 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-cx484" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.367134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt9pm\" (UniqueName: \"kubernetes.io/projected/fb72edda-e449-44f6-a85d-b74c0f3f9ad2-kube-api-access-vt9pm\") pod \"mariadb-operator-controller-manager-56bbcc9d85-nptld\" (UID: \"fb72edda-e449-44f6-a85d-b74c0f3f9ad2\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.367195 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h978r\" (UniqueName: \"kubernetes.io/projected/ed78cdfd-dc4e-4528-9542-6fc778f54e5f-kube-api-access-h978r\") pod \"keystone-operator-controller-manager-546d4bdf48-fkvsq\" (UID: \"ed78cdfd-dc4e-4528-9542-6fc778f54e5f\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.367301 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtndt\" (UniqueName: \"kubernetes.io/projected/d0ae9966-90f0-4d97-a056-dd9e86c81949-kube-api-access-dtndt\") pod \"ironic-operator-controller-manager-6c548fd776-w68ng\" (UID: \"d0ae9966-90f0-4d97-a056-dd9e86c81949\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.367353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxr2v\" (UniqueName: \"kubernetes.io/projected/795b9a42-a6d4-487b-84ef-0f1b3617ebfc-kube-api-access-fxr2v\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-7d98c\" (UID: \"795b9a42-a6d4-487b-84ef-0f1b3617ebfc\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.367391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2smln\" (UniqueName: \"kubernetes.io/projected/3673ec86-6e36-4f0b-ac14-87e5d89e283e-kube-api-access-2smln\") pod \"manila-operator-controller-manager-6546668bfd-zvp89\" (UID: \"3673ec86-6e36-4f0b-ac14-87e5d89e283e\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.382969 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtndt\" (UniqueName: \"kubernetes.io/projected/d0ae9966-90f0-4d97-a056-dd9e86c81949-kube-api-access-dtndt\") pod \"ironic-operator-controller-manager-6c548fd776-w68ng\" (UID: \"d0ae9966-90f0-4d97-a056-dd9e86c81949\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.384846 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2smln\" (UniqueName: \"kubernetes.io/projected/3673ec86-6e36-4f0b-ac14-87e5d89e283e-kube-api-access-2smln\") pod \"manila-operator-controller-manager-6546668bfd-zvp89\" (UID: \"3673ec86-6e36-4f0b-ac14-87e5d89e283e\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" Dec 01 21:51:02 crc 
kubenswrapper[4962]: I1201 21:51:02.386659 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h978r\" (UniqueName: \"kubernetes.io/projected/ed78cdfd-dc4e-4528-9542-6fc778f54e5f-kube-api-access-h978r\") pod \"keystone-operator-controller-manager-546d4bdf48-fkvsq\" (UID: \"ed78cdfd-dc4e-4528-9542-6fc778f54e5f\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.389247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt9pm\" (UniqueName: \"kubernetes.io/projected/fb72edda-e449-44f6-a85d-b74c0f3f9ad2-kube-api-access-vt9pm\") pod \"mariadb-operator-controller-manager-56bbcc9d85-nptld\" (UID: \"fb72edda-e449-44f6-a85d-b74c0f3f9ad2\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.392552 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.401739 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.404021 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.406404 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wdhrs" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.416246 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.417927 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.421971 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-xlt6h" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.443270 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.445140 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.453600 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.453840 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8rtm9" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.455580 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.464954 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.468548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lzd2\" (UniqueName: \"kubernetes.io/projected/d62cdff4-c4d1-44fb-99dc-bdd6a31d03af-kube-api-access-8lzd2\") pod \"ovn-operator-controller-manager-b6456fdb6-hfkpq\" (UID: \"d62cdff4-c4d1-44fb-99dc-bdd6a31d03af\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.468591 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxr2v\" (UniqueName: \"kubernetes.io/projected/795b9a42-a6d4-487b-84ef-0f1b3617ebfc-kube-api-access-fxr2v\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-7d98c\" (UID: \"795b9a42-a6d4-487b-84ef-0f1b3617ebfc\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.468696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c6xl\" (UniqueName: \"kubernetes.io/projected/f2e499a5-b89a-45d4-bd3e-9f743e010a51-kube-api-access-8c6xl\") pod \"octavia-operator-controller-manager-998648c74-xvbpf\" (UID: \"f2e499a5-b89a-45d4-bd3e-9f743e010a51\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.468761 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhl6\" (UniqueName: \"kubernetes.io/projected/1fb020cd-66c6-401d-be7e-9a26b62eb8d8-kube-api-access-slhl6\") pod \"nova-operator-controller-manager-697bc559fc-mqrwk\" (UID: \"1fb020cd-66c6-401d-be7e-9a26b62eb8d8\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.480676 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.512514 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.522511 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxr2v\" (UniqueName: \"kubernetes.io/projected/795b9a42-a6d4-487b-84ef-0f1b3617ebfc-kube-api-access-fxr2v\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-7d98c\" (UID: \"795b9a42-a6d4-487b-84ef-0f1b3617ebfc\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.583369 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lzd2\" (UniqueName: \"kubernetes.io/projected/d62cdff4-c4d1-44fb-99dc-bdd6a31d03af-kube-api-access-8lzd2\") pod \"ovn-operator-controller-manager-b6456fdb6-hfkpq\" (UID: \"d62cdff4-c4d1-44fb-99dc-bdd6a31d03af\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.583466 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.583488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5b2d\" (UniqueName: \"kubernetes.io/projected/fec45066-0c5d-48de-9c33-f166f33131f0-kube-api-access-c5b2d\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.583546 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c6xl\" (UniqueName: \"kubernetes.io/projected/f2e499a5-b89a-45d4-bd3e-9f743e010a51-kube-api-access-8c6xl\") pod \"octavia-operator-controller-manager-998648c74-xvbpf\" (UID: \"f2e499a5-b89a-45d4-bd3e-9f743e010a51\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.583621 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhl6\" (UniqueName: \"kubernetes.io/projected/1fb020cd-66c6-401d-be7e-9a26b62eb8d8-kube-api-access-slhl6\") pod \"nova-operator-controller-manager-697bc559fc-mqrwk\" (UID: \"1fb020cd-66c6-401d-be7e-9a26b62eb8d8\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.609271 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.613155 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4hgng"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.614566 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.615481 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.622494 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fmm2r" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.644387 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.650911 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.651565 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhl6\" (UniqueName: \"kubernetes.io/projected/1fb020cd-66c6-401d-be7e-9a26b62eb8d8-kube-api-access-slhl6\") pod \"nova-operator-controller-manager-697bc559fc-mqrwk\" (UID: \"1fb020cd-66c6-401d-be7e-9a26b62eb8d8\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.653812 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c6xl\" (UniqueName: \"kubernetes.io/projected/f2e499a5-b89a-45d4-bd3e-9f743e010a51-kube-api-access-8c6xl\") pod \"octavia-operator-controller-manager-998648c74-xvbpf\" (UID: \"f2e499a5-b89a-45d4-bd3e-9f743e010a51\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.658429 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.664182 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lzd2\" (UniqueName: \"kubernetes.io/projected/d62cdff4-c4d1-44fb-99dc-bdd6a31d03af-kube-api-access-8lzd2\") pod \"ovn-operator-controller-manager-b6456fdb6-hfkpq\" (UID: \"d62cdff4-c4d1-44fb-99dc-bdd6a31d03af\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.681801 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.688057 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nft94\" (UniqueName: \"kubernetes.io/projected/ec3039da-9f5e-4870-8579-8560a63221a8-kube-api-access-nft94\") pod \"placement-operator-controller-manager-78f8948974-4hgng\" (UID: \"ec3039da-9f5e-4870-8579-8560a63221a8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.688123 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5b2d\" (UniqueName: \"kubernetes.io/projected/fec45066-0c5d-48de-9c33-f166f33131f0-kube-api-access-c5b2d\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.688141 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:02 crc kubenswrapper[4962]: E1201 21:51:02.688326 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 21:51:02 crc kubenswrapper[4962]: E1201 21:51:02.688367 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert podName:fec45066-0c5d-48de-9c33-f166f33131f0 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:03.188354851 +0000 UTC m=+1047.289794046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" (UID: "fec45066-0c5d-48de-9c33-f166f33131f0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.699174 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4hgng"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.701857 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.711595 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.713427 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.716025 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4vhhc" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.761333 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.763977 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.783178 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.784354 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.784409 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.784561 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.785235 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95b773e188f611e19f1e133dda091ac575dae9bb165debbd86a90d7593910a0b"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.785294 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://95b773e188f611e19f1e133dda091ac575dae9bb165debbd86a90d7593910a0b" gracePeriod=600 Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.785391 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.788398 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wmfx5" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.789149 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5b2d\" (UniqueName: \"kubernetes.io/projected/fec45066-0c5d-48de-9c33-f166f33131f0-kube-api-access-c5b2d\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.789397 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nft94\" (UniqueName: \"kubernetes.io/projected/ec3039da-9f5e-4870-8579-8560a63221a8-kube-api-access-nft94\") pod \"placement-operator-controller-manager-78f8948974-4hgng\" (UID: \"ec3039da-9f5e-4870-8579-8560a63221a8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.789446 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2lg\" (UniqueName: \"kubernetes.io/projected/03f5786b-da6f-4b56-ac07-fb563f0a85b4-kube-api-access-pt2lg\") pod \"swift-operator-controller-manager-5f8c65bbfc-6tfrn\" (UID: \"03f5786b-da6f-4b56-ac07-fb563f0a85b4\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.789481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:02 crc kubenswrapper[4962]: E1201 21:51:02.789611 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 21:51:02 crc kubenswrapper[4962]: E1201 21:51:02.789642 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert podName:2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:03.789629241 +0000 UTC m=+1047.891068436 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert") pod "infra-operator-controller-manager-57548d458d-27r4m" (UID: "2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8") : secret "infra-operator-webhook-server-cert" not found Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.793018 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.817692 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nft94\" (UniqueName: \"kubernetes.io/projected/ec3039da-9f5e-4870-8579-8560a63221a8-kube-api-access-nft94\") pod \"placement-operator-controller-manager-78f8948974-4hgng\" (UID: \"ec3039da-9f5e-4870-8579-8560a63221a8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.829445 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.841082 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.842593 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.848165 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-m5kjf" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.853113 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.893541 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.897439 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2dw7\" (UniqueName: \"kubernetes.io/projected/0aff0b93-1032-412b-9628-3ab9e94717a8-kube-api-access-n2dw7\") pod \"test-operator-controller-manager-5854674fcc-zzc5v\" (UID: \"0aff0b93-1032-412b-9628-3ab9e94717a8\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.897514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh9qt\" (UniqueName: \"kubernetes.io/projected/af182ba4-78a6-41eb-bf65-8abd64207122-kube-api-access-kh9qt\") pod \"telemetry-operator-controller-manager-6c484b4dc4-ch82f\" (UID: \"af182ba4-78a6-41eb-bf65-8abd64207122\") " pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.897555 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2lg\" (UniqueName: \"kubernetes.io/projected/03f5786b-da6f-4b56-ac07-fb563f0a85b4-kube-api-access-pt2lg\") pod \"swift-operator-controller-manager-5f8c65bbfc-6tfrn\" (UID: \"03f5786b-da6f-4b56-ac07-fb563f0a85b4\") " 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.898414 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.900537 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.901107 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4tpgl" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.912019 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.941074 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2lg\" (UniqueName: \"kubernetes.io/projected/03f5786b-da6f-4b56-ac07-fb563f0a85b4-kube-api-access-pt2lg\") pod \"swift-operator-controller-manager-5f8c65bbfc-6tfrn\" (UID: \"03f5786b-da6f-4b56-ac07-fb563f0a85b4\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.964305 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf"] Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.965751 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:02 crc kubenswrapper[4962]: I1201 21:51:02.971273 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf"] Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:02.979863 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:02.980075 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5s9xs" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:02.980199 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:02.999165 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2dw7\" (UniqueName: \"kubernetes.io/projected/0aff0b93-1032-412b-9628-3ab9e94717a8-kube-api-access-n2dw7\") pod \"test-operator-controller-manager-5854674fcc-zzc5v\" (UID: \"0aff0b93-1032-412b-9628-3ab9e94717a8\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:02.999207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhczg\" (UniqueName: \"kubernetes.io/projected/c847e733-65b6-4724-8037-5199d847f1ba-kube-api-access-bhczg\") pod \"watcher-operator-controller-manager-769dc69bc-q7bxg\" (UID: \"c847e733-65b6-4724-8037-5199d847f1ba\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:02.999250 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh9qt\" (UniqueName: \"kubernetes.io/projected/af182ba4-78a6-41eb-bf65-8abd64207122-kube-api-access-kh9qt\") pod \"telemetry-operator-controller-manager-6c484b4dc4-ch82f\" (UID: \"af182ba4-78a6-41eb-bf65-8abd64207122\") " pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.011773 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z"] Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.018497 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z"] Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.018843 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.028260 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-dwmr5" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.035302 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh9qt\" (UniqueName: \"kubernetes.io/projected/af182ba4-78a6-41eb-bf65-8abd64207122-kube-api-access-kh9qt\") pod \"telemetry-operator-controller-manager-6c484b4dc4-ch82f\" (UID: \"af182ba4-78a6-41eb-bf65-8abd64207122\") " pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.036226 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2dw7\" (UniqueName: \"kubernetes.io/projected/0aff0b93-1032-412b-9628-3ab9e94717a8-kube-api-access-n2dw7\") pod \"test-operator-controller-manager-5854674fcc-zzc5v\" (UID: \"0aff0b93-1032-412b-9628-3ab9e94717a8\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.038309 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.106375 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxbw\" (UniqueName: \"kubernetes.io/projected/05992e60-e6fc-43a0-b44a-d177ae3f4c83-kube-api-access-dtxbw\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.106476 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.106548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhczg\" (UniqueName: \"kubernetes.io/projected/c847e733-65b6-4724-8037-5199d847f1ba-kube-api-access-bhczg\") pod \"watcher-operator-controller-manager-769dc69bc-q7bxg\" (UID: \"c847e733-65b6-4724-8037-5199d847f1ba\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.106592 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.106639 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn5jk\" (UniqueName: \"kubernetes.io/projected/400ba839-34f0-4463-a318-c1bcba6e5039-kube-api-access-bn5jk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lv89z\" (UID: \"400ba839-34f0-4463-a318-c1bcba6e5039\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.149364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhczg\" (UniqueName: \"kubernetes.io/projected/c847e733-65b6-4724-8037-5199d847f1ba-kube-api-access-bhczg\") pod \"watcher-operator-controller-manager-769dc69bc-q7bxg\" (UID: \"c847e733-65b6-4724-8037-5199d847f1ba\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.254542 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.255833 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.256016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn5jk\" (UniqueName: \"kubernetes.io/projected/400ba839-34f0-4463-a318-c1bcba6e5039-kube-api-access-bn5jk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lv89z\" (UID: \"400ba839-34f0-4463-a318-c1bcba6e5039\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.256074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.256163 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxbw\" (UniqueName: \"kubernetes.io/projected/05992e60-e6fc-43a0-b44a-d177ae3f4c83-kube-api-access-dtxbw\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.256233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.256455 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.256519 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:03.756500092 +0000 UTC m=+1047.857939287 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "metrics-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.258463 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.258526 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:03.758508569 +0000 UTC m=+1047.859947754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "webhook-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.258861 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.258894 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert podName:fec45066-0c5d-48de-9c33-f166f33131f0 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:04.25888613 +0000 UTC m=+1048.360325315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" (UID: "fec45066-0c5d-48de-9c33-f166f33131f0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.288153 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.293943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn5jk\" (UniqueName: \"kubernetes.io/projected/400ba839-34f0-4463-a318-c1bcba6e5039-kube-api-access-bn5jk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lv89z\" (UID: \"400ba839-34f0-4463-a318-c1bcba6e5039\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.303629 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxbw\" (UniqueName: \"kubernetes.io/projected/05992e60-e6fc-43a0-b44a-d177ae3f4c83-kube-api-access-dtxbw\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.312262 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="95b773e188f611e19f1e133dda091ac575dae9bb165debbd86a90d7593910a0b" exitCode=0 Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.312304 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"95b773e188f611e19f1e133dda091ac575dae9bb165debbd86a90d7593910a0b"} Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.312341 4962 scope.go:117] "RemoveContainer" containerID="d2879c1f1c1a43cf7797f56147cd78f2bf5ee957daff607dcee5e6d23c293a8c" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.357966 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.578045 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg"] Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.602287 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.765204 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.765662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.765382 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.765911 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:04.765896743 +0000 UTC m=+1048.867335938 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "webhook-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.765844 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.766359 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:04.766351036 +0000 UTC m=+1048.867790231 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "metrics-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.770577 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d"] Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.810044 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp"] Dec 01 21:51:03 crc kubenswrapper[4962]: W1201 21:51:03.817916 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e2461fa_57b4_406a_9801_522b2e3ee2f0.slice/crio-44b64a101a3a65415934d3e1fff90c7145329019d7a97dd1ed424672d78ee59f WatchSource:0}: Error finding container 44b64a101a3a65415934d3e1fff90c7145329019d7a97dd1ed424672d78ee59f: Status 404 returned error can't find the container with id 44b64a101a3a65415934d3e1fff90c7145329019d7a97dd1ed424672d78ee59f Dec 01 21:51:03 crc kubenswrapper[4962]: I1201 21:51:03.867804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.868111 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 21:51:03 crc kubenswrapper[4962]: E1201 21:51:03.868202 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert podName:2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:05.868180843 +0000 UTC m=+1049.969620038 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert") pod "infra-operator-controller-manager-57548d458d-27r4m" (UID: "2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8") : secret "infra-operator-webhook-server-cert" not found Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.088370 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97"] Dec 01 21:51:04 crc kubenswrapper[4962]: E1201 21:51:04.273762 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.281022 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:04 crc kubenswrapper[4962]: E1201 21:51:04.281065 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert podName:fec45066-0c5d-48de-9c33-f166f33131f0 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:06.281031517 +0000 UTC m=+1050.382470722 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" (UID: "fec45066-0c5d-48de-9c33-f166f33131f0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.323302 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" event={"ID":"8e655cd6-3169-46a0-b299-37d13dae8d3a","Type":"ContainerStarted","Data":"eeb046f44e4783d996daff744fdf7c335dbd467733381f863ada7ff6d161486c"} Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.324715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" event={"ID":"d871da7f-4b47-4931-aa3b-1525f50b2bde","Type":"ContainerStarted","Data":"d5cf787c9c20a195af33d2a1b5d003f9589987e0436dd8e7bfb042de5efc8633"} Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.325846 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" event={"ID":"0e2461fa-57b4-406a-9801-522b2e3ee2f0","Type":"ContainerStarted","Data":"44b64a101a3a65415934d3e1fff90c7145329019d7a97dd1ed424672d78ee59f"} Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.328529 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"7e98140d5fb11879a3903d3761dc38b8ef264c041494b571b46af54f4f57bb50"} Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.330475 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" 
event={"ID":"39217f35-ba4e-402b-84fe-876ca232ff60","Type":"ContainerStarted","Data":"b7845595438ebdad9315e4fe7a79a93ec0ded64e1e69ba534004fba7e59c777e"} Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.792711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.793055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:04 crc kubenswrapper[4962]: E1201 21:51:04.793274 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 21:51:04 crc kubenswrapper[4962]: E1201 21:51:04.793325 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:06.793312288 +0000 UTC m=+1050.894751483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "webhook-server-cert" not found Dec 01 21:51:04 crc kubenswrapper[4962]: E1201 21:51:04.793364 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 21:51:04 crc kubenswrapper[4962]: E1201 21:51:04.793386 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:06.79338011 +0000 UTC m=+1050.894819305 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "metrics-server-cert" not found Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.938499 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld"] Dec 01 21:51:04 crc kubenswrapper[4962]: I1201 21:51:04.956110 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh"] Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.020515 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c"] Dec 01 21:51:05 crc kubenswrapper[4962]: W1201 21:51:05.069796 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec3039da_9f5e_4870_8579_8560a63221a8.slice/crio-cb2aa93fbd6f91b5e92783c85812bf6b25c06541c276c33c6b8dabde657e2648 WatchSource:0}: Error finding container cb2aa93fbd6f91b5e92783c85812bf6b25c06541c276c33c6b8dabde657e2648: Status 404 returned error can't find the container with id cb2aa93fbd6f91b5e92783c85812bf6b25c06541c276c33c6b8dabde657e2648 Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.094523 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89"] Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.115242 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4hgng"] Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.126101 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq"] Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.136027 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n2dw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-zzc5v_openstack-operators(0aff0b93-1032-412b-9628-3ab9e94717a8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.142594 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n2dw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-zzc5v_openstack-operators(0aff0b93-1032-412b-9628-3ab9e94717a8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.143672 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:e82e6b4a488661603634ac58918e94b98a55620c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kh9qt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6c484b4dc4-ch82f_openstack-operators(af182ba4-78a6-41eb-bf65-8abd64207122): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.143755 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" podUID="0aff0b93-1032-412b-9628-3ab9e94717a8" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.148205 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kh9qt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6c484b4dc4-ch82f_openstack-operators(af182ba4-78a6-41eb-bf65-8abd64207122): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.149311 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-slhl6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-mqrwk_openstack-operators(1fb020cd-66c6-401d-be7e-9a26b62eb8d8): ErrImagePull: 
pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.149802 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" podUID="af182ba4-78a6-41eb-bf65-8abd64207122" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.150853 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhczg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-q7bxg_openstack-operators(c847e733-65b6-4724-8037-5199d847f1ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.151261 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-slhl6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-mqrwk_openstack-operators(1fb020cd-66c6-401d-be7e-9a26b62eb8d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.153132 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf"] Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.153660 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" podUID="1fb020cd-66c6-401d-be7e-9a26b62eb8d8" Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.156386 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhczg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-q7bxg_openstack-operators(c847e733-65b6-4724-8037-5199d847f1ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 21:51:05 crc 
kubenswrapper[4962]: E1201 21:51:05.157916 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" podUID="c847e733-65b6-4724-8037-5199d847f1ba"
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.161085 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.176335 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.193602 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.198797 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.207066 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.213313 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.218799 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.351632 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.354375 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" event={"ID":"3673ec86-6e36-4f0b-ac14-87e5d89e283e","Type":"ContainerStarted","Data":"bdbc01cf8ab0de193719d93cdd05078c7e4fa31afe9189901dced89b4dfa1a57"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.355433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" event={"ID":"1a9bd198-45fa-40ba-b3a0-55c150c211d6","Type":"ContainerStarted","Data":"61cca735d50afc82cb20dfb7396acb3ded02a3fc47f37205ae9c428056e202e3"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.357982 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn"]
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.361693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" event={"ID":"fb3ad1a2-8ee0-4d12-8499-d10819081f1b","Type":"ContainerStarted","Data":"bd4b3b0a24917caedba7cafa3813d192f2c1490c0457762f2c0dcbd202a5dc3a"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.364057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" event={"ID":"795b9a42-a6d4-487b-84ef-0f1b3617ebfc","Type":"ContainerStarted","Data":"018c89b5148129169db48c0bc6e00ea709dc3a93b3921168372e5e5916f2d04b"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.368671 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" event={"ID":"fb72edda-e449-44f6-a85d-b74c0f3f9ad2","Type":"ContainerStarted","Data":"65f285c3b6f0fa4e83d385d8e244e599a3819ea432622b8ab7e3dd062671bbfe"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.370074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" event={"ID":"d62cdff4-c4d1-44fb-99dc-bdd6a31d03af","Type":"ContainerStarted","Data":"3b09b5b3cd1b40bf34a5197c52596a71f191041e59a6fe15c1954c23d4cfd2f7"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.371173 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" event={"ID":"f2e499a5-b89a-45d4-bd3e-9f743e010a51","Type":"ContainerStarted","Data":"f37e3efbaa1eb833cbb2ab2da72b07f79e1312a63ee253f9db90e1dd8947b3b5"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.372530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" event={"ID":"c847e733-65b6-4724-8037-5199d847f1ba","Type":"ContainerStarted","Data":"e976d71e60553fc0c62e7745d2bb1bac1ea43129e32fd1e5f41634e519e715a0"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.375278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" event={"ID":"0aff0b93-1032-412b-9628-3ab9e94717a8","Type":"ContainerStarted","Data":"000282f9291c676f6b0343e4495a977015c7b56e3609c412b49c7664a43a4987"}
Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.376164 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" podUID="c847e733-65b6-4724-8037-5199d847f1ba"
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.378105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" event={"ID":"1fb020cd-66c6-401d-be7e-9a26b62eb8d8","Type":"ContainerStarted","Data":"1b8993836da6fbbcfc62309a546f07cf2814c96df25d1ad617689478186d6ac7"}
Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.378707 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" podUID="0aff0b93-1032-412b-9628-3ab9e94717a8"
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.379492 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" event={"ID":"ed78cdfd-dc4e-4528-9542-6fc778f54e5f","Type":"ContainerStarted","Data":"d64359c452a3feef3d4b86def5013d1c3e84f6d553e4bcb78aa630425ee386ce"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.383230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" event={"ID":"ec3039da-9f5e-4870-8579-8560a63221a8","Type":"ContainerStarted","Data":"cb2aa93fbd6f91b5e92783c85812bf6b25c06541c276c33c6b8dabde657e2648"}
Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.384735 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" podUID="1fb020cd-66c6-401d-be7e-9a26b62eb8d8"
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.395780 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" event={"ID":"d0ae9966-90f0-4d97-a056-dd9e86c81949","Type":"ContainerStarted","Data":"1e43e06ac3d5453408eca702d3bc6eae6839da7581ab79ea747c36bd25606f36"}
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.405383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" event={"ID":"af182ba4-78a6-41eb-bf65-8abd64207122","Type":"ContainerStarted","Data":"1024a890a977aa552aabc59dd5072c53ef7d0ea4f5d23b24c74776e4bf5cd1ae"}
Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.409713 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:e82e6b4a488661603634ac58918e94b98a55620c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" podUID="af182ba4-78a6-41eb-bf65-8abd64207122"
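The ErrImagePull: "pull QPS exceeded" failures above, and the ImagePullBackOff entries that follow them, are not registry errors: the kubelet throttles image pulls client-side with a token bucket (the registryPullQPS and registryBurst kubelet settings), and a burst of operator pods all starting at once exhausts it. Below is a minimal sketch of that behavior using golang.org/x/time/rate; the 5 QPS / 10 burst values match the usual kubelet defaults, but treat them and the rest of the sketch as illustrative rather than kubelet source.

// A minimal token-bucket sketch of the kubelet-style pull throttle that
// produces the "pull QPS exceeded" errors above. Illustrative, not kubelet code.
package main

import (
	"errors"
	"fmt"

	"golang.org/x/time/rate"
)

var errPullQPS = errors.New("pull QPS exceeded")

func tryPull(limiter *rate.Limiter, image string) error {
	// Non-blocking check: if no token is available the pull fails
	// immediately instead of queueing, and the pod worker backs off.
	if !limiter.Allow() {
		return errPullQPS
	}
	fmt.Println("pulling", image)
	return nil
}

func main() {
	// 5 tokens/s with a burst of 10, mirroring common
	// registryPullQPS/registryBurst defaults (an assumption here).
	limiter := rate.NewLimiter(rate.Limit(5), 10)
	// A burst of operator pods pulling at once: the first 10 pulls get
	// tokens, the rest fail fast with "pull QPS exceeded".
	for i := 0; i < 15; i++ {
		if err := tryPull(limiter, fmt.Sprintf("quay.io/example/operator-%d", i)); err != nil {
			fmt.Printf("pull %d: %v\n", i, err)
		}
	}
}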
Dec 01 21:51:05 crc kubenswrapper[4962]: I1201 21:51:05.925919 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m"
Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.926470 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 01 21:51:05 crc kubenswrapper[4962]: E1201 21:51:05.926525 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert podName:2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:09.926510034 +0000 UTC m=+1054.027949219 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert") pod "infra-operator-controller-manager-57548d458d-27r4m" (UID: "2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8") : secret "infra-operator-webhook-server-cert" not found
Dec 01 21:51:06 crc kubenswrapper[4962]: I1201 21:51:06.336357 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc"
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.337465 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.337520 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert podName:fec45066-0c5d-48de-9c33-f166f33131f0 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:10.337502585 +0000 UTC m=+1054.438941780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" (UID: "fec45066-0c5d-48de-9c33-f166f33131f0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 21:51:06 crc kubenswrapper[4962]: I1201 21:51:06.418481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" event={"ID":"03f5786b-da6f-4b56-ac07-fb563f0a85b4","Type":"ContainerStarted","Data":"9a7467d6b815bc435756a0481d310f1a4ee882e442af94ec1392db8b476af207"}
Dec 01 21:51:06 crc kubenswrapper[4962]: I1201 21:51:06.420874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" event={"ID":"400ba839-34f0-4463-a318-c1bcba6e5039","Type":"ContainerStarted","Data":"c9dacd507aee5cce0631c9d854bf4f64ce67b15da511570e25574ef77e2c0aac"}
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.423878 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" podUID="1fb020cd-66c6-401d-be7e-9a26b62eb8d8"
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.424247 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:e82e6b4a488661603634ac58918e94b98a55620c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" podUID="af182ba4-78a6-41eb-bf65-8abd64207122"
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.425028 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" podUID="0aff0b93-1032-412b-9628-3ab9e94717a8"
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.425633 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" podUID="c847e733-65b6-4724-8037-5199d847f1ba"
Dec 01 21:51:06 crc kubenswrapper[4962]: I1201 21:51:06.846094 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf"
Dec 01 21:51:06 crc kubenswrapper[4962]: I1201 21:51:06.846337 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf"
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.846548 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.846592 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:10.846579697 +0000 UTC m=+1054.948018882 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "webhook-server-cert" not found
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.846892 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 01 21:51:06 crc kubenswrapper[4962]: E1201 21:51:06.846915 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:10.846906836 +0000 UTC m=+1054.948346021 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "metrics-server-cert" not found
Dec 01 21:51:10 crc kubenswrapper[4962]: I1201 21:51:10.009314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m"
Dec 01 21:51:10 crc kubenswrapper[4962]: E1201 21:51:10.009803 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 01 21:51:10 crc kubenswrapper[4962]: E1201 21:51:10.010357 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert podName:2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:18.010337184 +0000 UTC m=+1062.111776379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert") pod "infra-operator-controller-manager-57548d458d-27r4m" (UID: "2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8") : secret "infra-operator-webhook-server-cert" not found
Dec 01 21:51:10 crc kubenswrapper[4962]: I1201 21:51:10.417806 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc"
Dec 01 21:51:10 crc kubenswrapper[4962]: E1201 21:51:10.417990 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 21:51:10 crc kubenswrapper[4962]: E1201 21:51:10.418212 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert podName:fec45066-0c5d-48de-9c33-f166f33131f0 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:18.418188586 +0000 UTC m=+1062.519627781 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" (UID: "fec45066-0c5d-48de-9c33-f166f33131f0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 21:51:10 crc kubenswrapper[4962]: I1201 21:51:10.935185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf"
Dec 01 21:51:10 crc kubenswrapper[4962]: I1201 21:51:10.935263 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf"
Dec 01 21:51:10 crc kubenswrapper[4962]: E1201 21:51:10.935379 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 01 21:51:10 crc kubenswrapper[4962]: E1201 21:51:10.935453 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs podName:05992e60-e6fc-43a0-b44a-d177ae3f4c83 nodeName:}" failed. No retries permitted until 2025-12-01 21:51:18.935436531 +0000 UTC m=+1063.036875726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "metrics-server-cert" not found
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs") pod "openstack-operator-controller-manager-d8646fccf-4h8tf" (UID: "05992e60-e6fc-43a0-b44a-d177ae3f4c83") : secret "webhook-server-cert" not found Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.074977 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.091119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8-cert\") pod \"infra-operator-controller-manager-57548d458d-27r4m\" (UID: \"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.257288 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.487875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.495354 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fec45066-0c5d-48de-9c33-f166f33131f0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc\" (UID: \"fec45066-0c5d-48de-9c33-f166f33131f0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.505589 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.998530 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:18 crc kubenswrapper[4962]: I1201 21:51:18.998905 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:19 crc kubenswrapper[4962]: I1201 21:51:19.003757 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-metrics-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:19 crc kubenswrapper[4962]: I1201 21:51:19.012328 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/05992e60-e6fc-43a0-b44a-d177ae3f4c83-webhook-certs\") pod \"openstack-operator-controller-manager-d8646fccf-4h8tf\" (UID: \"05992e60-e6fc-43a0-b44a-d177ae3f4c83\") " pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:19 crc kubenswrapper[4962]: I1201 21:51:19.114042 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:27 crc kubenswrapper[4962]: E1201 21:51:27.264164 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 01 21:51:27 crc kubenswrapper[4962]: E1201 21:51:27.265147 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fxr2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-7d98c_openstack-operators(795b9a42-a6d4-487b-84ef-0f1b3617ebfc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:27 crc kubenswrapper[4962]: E1201 21:51:27.766809 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 01 21:51:27 crc kubenswrapper[4962]: E1201 21:51:27.767072 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vt9pm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-nptld_openstack-operators(fb72edda-e449-44f6-a85d-b74c0f3f9ad2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:29 crc kubenswrapper[4962]: E1201 21:51:29.400803 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 01 21:51:29 crc kubenswrapper[4962]: E1201 21:51:29.402272 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nft94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4hgng_openstack-operators(ec3039da-9f5e-4870-8579-8560a63221a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:31 crc kubenswrapper[4962]: E1201 21:51:31.289237 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 01 21:51:31 crc kubenswrapper[4962]: E1201 21:51:31.289768 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9mss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-sjzl9_openstack-operators(1a9bd198-45fa-40ba-b3a0-55c150c211d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:31 crc kubenswrapper[4962]: E1201 21:51:31.826282 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 01 21:51:31 crc kubenswrapper[4962]: E1201 21:51:31.826504 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8c6xl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-xvbpf_openstack-operators(f2e499a5-b89a-45d4-bd3e-9f743e010a51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:32 crc kubenswrapper[4962]: E1201 21:51:32.379992 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 01 21:51:32 crc kubenswrapper[4962]: E1201 21:51:32.380527 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bvjnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-t725d_openstack-operators(0e2461fa-57b4-406a-9801-522b2e3ee2f0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:32 crc kubenswrapper[4962]: E1201 21:51:32.896735 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 01 21:51:32 crc kubenswrapper[4962]: E1201 21:51:32.897052 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ppdlm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-mllgh_openstack-operators(fb3ad1a2-8ee0-4d12-8499-d10819081f1b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:33 crc kubenswrapper[4962]: E1201 21:51:33.371165 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 01 21:51:33 crc kubenswrapper[4962]: E1201 21:51:33.371373 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dtndt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-w68ng_openstack-operators(d0ae9966-90f0-4d97-a056-dd9e86c81949): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:33 crc kubenswrapper[4962]: E1201 21:51:33.882826 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 01 21:51:33 crc kubenswrapper[4962]: E1201 21:51:33.883351 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bn5jk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-lv89z_openstack-operators(400ba839-34f0-4463-a318-c1bcba6e5039): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Dec 01 21:51:33 crc kubenswrapper[4962]: E1201 21:51:33.884573 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" podUID="400ba839-34f0-4463-a318-c1bcba6e5039" Dec 01 21:51:34 crc kubenswrapper[4962]: E1201 21:51:34.425662 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Dec 01 21:51:34 crc kubenswrapper[4962]: E1201 21:51:34.425887 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h978r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-fkvsq_openstack-operators(ed78cdfd-dc4e-4528-9542-6fc778f54e5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:34 crc kubenswrapper[4962]: E1201 21:51:34.745349 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" podUID="400ba839-34f0-4463-a318-c1bcba6e5039" Dec 01 21:51:42 crc kubenswrapper[4962]: E1201 21:51:42.683589 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:e82e6b4a488661603634ac58918e94b98a55620c" Dec 01 21:51:42 crc kubenswrapper[4962]: E1201 21:51:42.683998 4962 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:e82e6b4a488661603634ac58918e94b98a55620c" Dec 01 21:51:42 crc kubenswrapper[4962]: E1201 21:51:42.684178 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:e82e6b4a488661603634ac58918e94b98a55620c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kh9qt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6c484b4dc4-ch82f_openstack-operators(af182ba4-78a6-41eb-bf65-8abd64207122): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:51:43 crc kubenswrapper[4962]: I1201 21:51:43.201458 4962 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-27r4m"] Dec 01 21:51:43 crc kubenswrapper[4962]: I1201 21:51:43.212560 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc"] Dec 01 21:51:43 crc kubenswrapper[4962]: I1201 21:51:43.337920 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf"] Dec 01 21:51:43 crc kubenswrapper[4962]: W1201 21:51:43.638801 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05992e60_e6fc_43a0_b44a_d177ae3f4c83.slice/crio-c23e746ba187223d8c41b903db851f952757233fb822e23fd34a90f1a991fb28 WatchSource:0}: Error finding container c23e746ba187223d8c41b903db851f952757233fb822e23fd34a90f1a991fb28: Status 404 returned error can't find the container with id c23e746ba187223d8c41b903db851f952757233fb822e23fd34a90f1a991fb28 Dec 01 21:51:43 crc kubenswrapper[4962]: I1201 21:51:43.842595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" event={"ID":"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8","Type":"ContainerStarted","Data":"fadb8d6257740338b0e78b02d1fc76f598f6e5bb4c818e9f0c3f58cf76e3ff9e"} Dec 01 21:51:43 crc kubenswrapper[4962]: I1201 21:51:43.844404 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" event={"ID":"d62cdff4-c4d1-44fb-99dc-bdd6a31d03af","Type":"ContainerStarted","Data":"b7361c80321f533d589922d316fddca6d4459727ef537fe892f183865b20f71b"} Dec 01 21:51:43 crc kubenswrapper[4962]: I1201 21:51:43.863326 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" event={"ID":"fec45066-0c5d-48de-9c33-f166f33131f0","Type":"ContainerStarted","Data":"1ea480ad9873a6f24f12a2d6939a6c9d78933dd0674487a93e0db128885ed9e7"} Dec 01 21:51:43 crc kubenswrapper[4962]: I1201 21:51:43.865216 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" event={"ID":"05992e60-e6fc-43a0-b44a-d177ae3f4c83","Type":"ContainerStarted","Data":"c23e746ba187223d8c41b903db851f952757233fb822e23fd34a90f1a991fb28"} Dec 01 21:51:44 crc kubenswrapper[4962]: I1201 21:51:44.874197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" event={"ID":"3673ec86-6e36-4f0b-ac14-87e5d89e283e","Type":"ContainerStarted","Data":"00b02ce8531f180d7b0f716cfcb3db988b4edfd2a5846888e90627e40af63441"} Dec 01 21:51:44 crc kubenswrapper[4962]: I1201 21:51:44.877310 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" event={"ID":"8e655cd6-3169-46a0-b299-37d13dae8d3a","Type":"ContainerStarted","Data":"cc3e711bfb2262facceb55fea7bda2acbcff7ff458a08792ba11aeae43314ce8"} Dec 01 21:51:44 crc kubenswrapper[4962]: I1201 21:51:44.879649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" event={"ID":"d871da7f-4b47-4931-aa3b-1525f50b2bde","Type":"ContainerStarted","Data":"3f78f1fb42089ef743de7fed12bd81abf8fc0e80a9da6ee2641027cf1cf46dac"} Dec 01 21:51:44 crc kubenswrapper[4962]: I1201 21:51:44.881635 
4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" event={"ID":"39217f35-ba4e-402b-84fe-876ca232ff60","Type":"ContainerStarted","Data":"8bb907109fefc527af2c5e1403b98e12f959b325d146db3e70de62a892f857cc"} Dec 01 21:51:44 crc kubenswrapper[4962]: I1201 21:51:44.883789 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" event={"ID":"03f5786b-da6f-4b56-ac07-fb563f0a85b4","Type":"ContainerStarted","Data":"22214a8538c54a14be5abe179098c1ccce20da795b2ce831037f8a14010b66bf"} Dec 01 21:51:46 crc kubenswrapper[4962]: I1201 21:51:46.906205 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" event={"ID":"05992e60-e6fc-43a0-b44a-d177ae3f4c83","Type":"ContainerStarted","Data":"135fde837c2efefd950d1df6a4fb0edc71ec38fddd70677f22abe7370de52810"} Dec 01 21:51:46 crc kubenswrapper[4962]: I1201 21:51:46.906626 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:51:46 crc kubenswrapper[4962]: I1201 21:51:46.908982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" event={"ID":"c847e733-65b6-4724-8037-5199d847f1ba","Type":"ContainerStarted","Data":"24ca5f6ce47dd2c05968e6f0d4fa56842c7fa31fcb13db47cd892a760dcc5f77"} Dec 01 21:51:46 crc kubenswrapper[4962]: I1201 21:51:46.914450 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" event={"ID":"0aff0b93-1032-412b-9628-3ab9e94717a8","Type":"ContainerStarted","Data":"2933ae0a92ad690021522ba844e874be0e59427e45157e088fbeb2e84ebe9b3d"} Dec 01 21:51:46 crc kubenswrapper[4962]: I1201 21:51:46.915829 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" event={"ID":"1fb020cd-66c6-401d-be7e-9a26b62eb8d8","Type":"ContainerStarted","Data":"59fe520503f8a46d4dfb62fe0f6756e6440fe569c35a826f4a331d69cd62c3d2"} Dec 01 21:51:46 crc kubenswrapper[4962]: I1201 21:51:46.944444 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" podStartSLOduration=44.944423574 podStartE2EDuration="44.944423574s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:51:46.934448351 +0000 UTC m=+1091.035887546" watchObservedRunningTime="2025-12-01 21:51:46.944423574 +0000 UTC m=+1091.045862779" Dec 01 21:51:49 crc kubenswrapper[4962]: E1201 21:51:49.702029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" podUID="ec3039da-9f5e-4870-8579-8560a63221a8" Dec 01 21:51:49 crc kubenswrapper[4962]: E1201 21:51:49.753828 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" podUID="fb3ad1a2-8ee0-4d12-8499-d10819081f1b" Dec 01 21:51:49 crc kubenswrapper[4962]: E1201 21:51:49.796495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" podUID="f2e499a5-b89a-45d4-bd3e-9f743e010a51" Dec 01 21:51:49 crc kubenswrapper[4962]: I1201 21:51:49.954182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" event={"ID":"ec3039da-9f5e-4870-8579-8560a63221a8","Type":"ContainerStarted","Data":"a36446dcfd0eb1fdd221acacf3318cd1b09cba437e7a5a9a85df36a81df1ecbe"} Dec 01 21:51:49 crc kubenswrapper[4962]: I1201 21:51:49.960199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" event={"ID":"fec45066-0c5d-48de-9c33-f166f33131f0","Type":"ContainerStarted","Data":"599115b40714e6294230370eaa6b73d4f8ba58b70bda58932595e634de3f47b8"} Dec 01 21:51:49 crc kubenswrapper[4962]: I1201 21:51:49.961686 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" event={"ID":"f2e499a5-b89a-45d4-bd3e-9f743e010a51","Type":"ContainerStarted","Data":"0e9f912e9af8193033ac7f38c1e06bbb99a976ffcbb4c23f26e62640a021696e"} Dec 01 21:51:49 crc kubenswrapper[4962]: I1201 21:51:49.980380 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" event={"ID":"400ba839-34f0-4463-a318-c1bcba6e5039","Type":"ContainerStarted","Data":"16812dc8f9a2444d94e941aa737f3edbf065509ecbff7cbe652ae7a2236821e9"} Dec 01 21:51:49 crc kubenswrapper[4962]: E1201 21:51:49.981245 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" podUID="0e2461fa-57b4-406a-9801-522b2e3ee2f0" Dec 01 21:51:49 crc kubenswrapper[4962]: I1201 21:51:49.987236 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" event={"ID":"03f5786b-da6f-4b56-ac07-fb563f0a85b4","Type":"ContainerStarted","Data":"9a9d958c2f84eebabbf1bed1e7cf85ccfa449d43fb62dd32d9f0529347c99bae"} Dec 01 21:51:49 crc kubenswrapper[4962]: I1201 21:51:49.988294 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" Dec 01 21:51:49 crc kubenswrapper[4962]: I1201 21:51:49.991464 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.001624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" event={"ID":"1fb020cd-66c6-401d-be7e-9a26b62eb8d8","Type":"ContainerStarted","Data":"2af4aeff7f9c10f182377e472d6e0333bbf1bcd68e28e6b0dc45769b6186fa85"} Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.002468 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.021283 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" event={"ID":"3673ec86-6e36-4f0b-ac14-87e5d89e283e","Type":"ContainerStarted","Data":"67fffe49bd7778e0bee682b363ce6fe89b817991c65b0de224fed6eb2fd73240"} Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.023564 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.026808 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" event={"ID":"d871da7f-4b47-4931-aa3b-1525f50b2bde","Type":"ContainerStarted","Data":"724ad62214145226e9ade0d09b7bc3a10a8a889c35ef1d72d7842c509d19a5c0"} Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.026974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.027909 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-6tfrn" podStartSLOduration=4.221487136 podStartE2EDuration="48.027888438s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.361480411 +0000 UTC m=+1049.462919606" lastFinishedPulling="2025-12-01 21:51:49.167881703 +0000 UTC m=+1093.269320908" observedRunningTime="2025-12-01 21:51:50.015409193 +0000 UTC m=+1094.116848398" watchObservedRunningTime="2025-12-01 21:51:50.027888438 +0000 UTC m=+1094.129327643" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.029199 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.039655 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.051198 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lv89z" podStartSLOduration=4.159023028 podStartE2EDuration="48.051103788s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.364134126 +0000 UTC m=+1049.465573321" lastFinishedPulling="2025-12-01 21:51:49.256214876 +0000 UTC m=+1093.357654081" observedRunningTime="2025-12-01 21:51:50.03958584 +0000 UTC m=+1094.141025035" watchObservedRunningTime="2025-12-01 21:51:50.051103788 +0000 UTC m=+1094.152542983" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.052684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" event={"ID":"fb3ad1a2-8ee0-4d12-8499-d10819081f1b","Type":"ContainerStarted","Data":"14a6b3005c04b06f4e226939d60abedce07c3380b80235641f76aa7c5e5adf95"} Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.074600 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" 
event={"ID":"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8","Type":"ContainerStarted","Data":"ef36ab94f962332c8fa63b097b27bca69660a2ce0d90163d311f3d0889794eb8"} Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.085860 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-zvp89" podStartSLOduration=4.028704571 podStartE2EDuration="48.085843616s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.055646281 +0000 UTC m=+1049.157085476" lastFinishedPulling="2025-12-01 21:51:49.112785296 +0000 UTC m=+1093.214224521" observedRunningTime="2025-12-01 21:51:50.057827229 +0000 UTC m=+1094.159266424" watchObservedRunningTime="2025-12-01 21:51:50.085843616 +0000 UTC m=+1094.187282821" Dec 01 21:51:50 crc kubenswrapper[4962]: E1201 21:51:50.210958 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" podUID="1a9bd198-45fa-40ba-b3a0-55c150c211d6" Dec 01 21:51:50 crc kubenswrapper[4962]: E1201 21:51:50.311956 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" podUID="fb72edda-e449-44f6-a85d-b74c0f3f9ad2" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.314697 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" podStartSLOduration=10.726212651 podStartE2EDuration="48.314680666s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.149068558 +0000 UTC m=+1049.250507753" lastFinishedPulling="2025-12-01 21:51:42.737536573 +0000 UTC m=+1086.838975768" observedRunningTime="2025-12-01 21:51:50.109072447 +0000 UTC m=+1094.210511642" watchObservedRunningTime="2025-12-01 21:51:50.314680666 +0000 UTC m=+1094.416119861" Dec 01 21:51:50 crc kubenswrapper[4962]: I1201 21:51:50.353890 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-8xc97" podStartSLOduration=4.198601169 podStartE2EDuration="49.353869851s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:04.096546239 +0000 UTC m=+1048.197985434" lastFinishedPulling="2025-12-01 21:51:49.251814911 +0000 UTC m=+1093.353254116" observedRunningTime="2025-12-01 21:51:50.127963194 +0000 UTC m=+1094.229402399" watchObservedRunningTime="2025-12-01 21:51:50.353869851 +0000 UTC m=+1094.455309046" Dec 01 21:51:50 crc kubenswrapper[4962]: E1201 21:51:50.387326 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" podUID="af182ba4-78a6-41eb-bf65-8abd64207122" Dec 01 21:51:50 crc kubenswrapper[4962]: E1201 21:51:50.615467 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" podUID="795b9a42-a6d4-487b-84ef-0f1b3617ebfc" Dec 01 21:51:50 crc kubenswrapper[4962]: E1201 21:51:50.704520 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" podUID="ed78cdfd-dc4e-4528-9542-6fc778f54e5f" Dec 01 21:51:50 crc kubenswrapper[4962]: E1201 21:51:50.780781 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" podUID="d0ae9966-90f0-4d97-a056-dd9e86c81949" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.088197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" event={"ID":"0e2461fa-57b4-406a-9801-522b2e3ee2f0","Type":"ContainerStarted","Data":"61785d37d4584429c8bd8f9497077fd62711d5964908a9d2c8eeb8826af1effc"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.098106 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" event={"ID":"d0ae9966-90f0-4d97-a056-dd9e86c81949","Type":"ContainerStarted","Data":"7b7a6ffd98d7385dcdea83e96de3ecbf5c15aff3a082cde2cc025a646a646c4d"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.100002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" event={"ID":"39217f35-ba4e-402b-84fe-876ca232ff60","Type":"ContainerStarted","Data":"484641be035094ceb86fa79157bbc751f63b9cdda83a950c90e6b4238eedec89"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.100212 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.102113 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.103057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" event={"ID":"0aff0b93-1032-412b-9628-3ab9e94717a8","Type":"ContainerStarted","Data":"269ac3e42f953f23e1aa1d6551da8c450ed0fca00af018bf8fb65985d3d9efbe"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.103676 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.105264 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.108258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" event={"ID":"fb72edda-e449-44f6-a85d-b74c0f3f9ad2","Type":"ContainerStarted","Data":"701367aa2f3518745056d8548847db2fbf074f47b794706dcd31b25e1c3b1016"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.111609 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" event={"ID":"1a9bd198-45fa-40ba-b3a0-55c150c211d6","Type":"ContainerStarted","Data":"fd32ad8b82b6fd8ad5b393b6972d943cf404e60c472f2bfbb97b56b871d4f4d7"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.128562 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" event={"ID":"d62cdff4-c4d1-44fb-99dc-bdd6a31d03af","Type":"ContainerStarted","Data":"23fede2eef45d3e76070ccdc9e19daa3f5c3dc1dfcad1eb236d98175c968b92b"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.129791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.133485 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.142911 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.155322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" event={"ID":"c847e733-65b6-4724-8037-5199d847f1ba","Type":"ContainerStarted","Data":"d538edc331c39f67f1d7616380fbb3568aa9935674eb032ae6d7bcf8990865db"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.156483 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.161218 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.164812 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-zzc5v" podStartSLOduration=11.561864461999999 podStartE2EDuration="49.164779308s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.135856763 +0000 UTC m=+1049.237295958" lastFinishedPulling="2025-12-01 21:51:42.738771609 +0000 UTC m=+1086.840210804" observedRunningTime="2025-12-01 21:51:51.158891541 +0000 UTC m=+1095.260330736" watchObservedRunningTime="2025-12-01 21:51:51.164779308 +0000 UTC m=+1095.266218503" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.164917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" event={"ID":"af182ba4-78a6-41eb-bf65-8abd64207122","Type":"ContainerStarted","Data":"0b377bdecf4c732942728eca7299c4182039a883cc183321e03b6d4ae6a727c4"} Dec 01 21:51:51 crc kubenswrapper[4962]: E1201 21:51:51.171081 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:e82e6b4a488661603634ac58918e94b98a55620c\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" podUID="af182ba4-78a6-41eb-bf65-8abd64207122" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 
21:51:51.175091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" event={"ID":"795b9a42-a6d4-487b-84ef-0f1b3617ebfc","Type":"ContainerStarted","Data":"1d2bdcc0761ddf0ac6349cf33424b39d40da43ddfe2df9ba5b988ccafe32ac3e"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.193677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" event={"ID":"fec45066-0c5d-48de-9c33-f166f33131f0","Type":"ContainerStarted","Data":"efe6320da6b69f72742d97566b61eec1f540af91a0f1788f64ac054bdc3e3d61"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.194504 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.198010 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-2nvkp" podStartSLOduration=4.86762752 podStartE2EDuration="50.197989553s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:03.80517775 +0000 UTC m=+1047.906616945" lastFinishedPulling="2025-12-01 21:51:49.135539783 +0000 UTC m=+1093.236978978" observedRunningTime="2025-12-01 21:51:51.190051697 +0000 UTC m=+1095.291490912" watchObservedRunningTime="2025-12-01 21:51:51.197989553 +0000 UTC m=+1095.299428748" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.210182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" event={"ID":"8e655cd6-3169-46a0-b299-37d13dae8d3a","Type":"ContainerStarted","Data":"affe0a78c45b2a0e6b7a102b92a97e0d08eaa5cb9e0211b758a1f21b5d1c26dc"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.210448 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.220731 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" event={"ID":"ed78cdfd-dc4e-4528-9542-6fc778f54e5f","Type":"ContainerStarted","Data":"77ce4a01e6ef366eb8de7f3e7b5b753f83e9345d921120d811ea40ad615e7164"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.228828 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" event={"ID":"2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8","Type":"ContainerStarted","Data":"93a28addbacaa7515f81d155746c6ade29b3753636ae36dd74b017746fd7eac0"} Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.229580 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.231695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.249530 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.269776 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mqrwk" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.316436 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-sqwvg" podStartSLOduration=4.890373226 podStartE2EDuration="50.316418191s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:03.709265822 +0000 UTC m=+1047.810705017" lastFinishedPulling="2025-12-01 21:51:49.135310757 +0000 UTC m=+1093.236749982" observedRunningTime="2025-12-01 21:51:51.297147703 +0000 UTC m=+1095.398586898" watchObservedRunningTime="2025-12-01 21:51:51.316418191 +0000 UTC m=+1095.417857386" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.331420 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hfkpq" podStartSLOduration=5.288197028 podStartE2EDuration="49.331396687s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.125331443 +0000 UTC m=+1049.226770638" lastFinishedPulling="2025-12-01 21:51:49.168531102 +0000 UTC m=+1093.269970297" observedRunningTime="2025-12-01 21:51:51.313671793 +0000 UTC m=+1095.415110988" watchObservedRunningTime="2025-12-01 21:51:51.331396687 +0000 UTC m=+1095.432835882" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.360951 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" podStartSLOduration=43.825120754 podStartE2EDuration="49.360918547s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:43.640258543 +0000 UTC m=+1087.741697738" lastFinishedPulling="2025-12-01 21:51:49.176056326 +0000 UTC m=+1093.277495531" observedRunningTime="2025-12-01 21:51:51.341463603 +0000 UTC m=+1095.442902798" watchObservedRunningTime="2025-12-01 21:51:51.360918547 +0000 UTC m=+1095.462357742" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.364843 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-q7bxg" podStartSLOduration=11.629157925 podStartE2EDuration="49.364826138s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.150721385 +0000 UTC m=+1049.252160580" lastFinishedPulling="2025-12-01 21:51:42.886389598 +0000 UTC m=+1086.987828793" observedRunningTime="2025-12-01 21:51:51.362381139 +0000 UTC m=+1095.463820344" watchObservedRunningTime="2025-12-01 21:51:51.364826138 +0000 UTC m=+1095.466265333" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.390697 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" podStartSLOduration=44.765637051 podStartE2EDuration="50.390667513s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:43.640278873 +0000 UTC m=+1087.741718088" lastFinishedPulling="2025-12-01 21:51:49.265309345 +0000 UTC m=+1093.366748550" observedRunningTime="2025-12-01 21:51:51.388303206 +0000 UTC m=+1095.489742401" watchObservedRunningTime="2025-12-01 21:51:51.390667513 +0000 UTC m=+1095.492106708" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.409381 4962 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" podStartSLOduration=3.65713353 podStartE2EDuration="49.409363195s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.082901596 +0000 UTC m=+1049.184340791" lastFinishedPulling="2025-12-01 21:51:50.835131261 +0000 UTC m=+1094.936570456" observedRunningTime="2025-12-01 21:51:51.409226481 +0000 UTC m=+1095.510665676" watchObservedRunningTime="2025-12-01 21:51:51.409363195 +0000 UTC m=+1095.510802380" Dec 01 21:51:51 crc kubenswrapper[4962]: I1201 21:51:51.471644 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" podStartSLOduration=4.628719892 podStartE2EDuration="50.471625016s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:04.992930377 +0000 UTC m=+1049.094369572" lastFinishedPulling="2025-12-01 21:51:50.835835501 +0000 UTC m=+1094.937274696" observedRunningTime="2025-12-01 21:51:51.468107256 +0000 UTC m=+1095.569546451" watchObservedRunningTime="2025-12-01 21:51:51.471625016 +0000 UTC m=+1095.573064211" Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.261628 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" event={"ID":"fb3ad1a2-8ee0-4d12-8499-d10819081f1b","Type":"ContainerStarted","Data":"1f8f54fc7f089124ba30b60230c64767f75547b36180917cf76d49b03b4dc8be"} Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.264425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" event={"ID":"ec3039da-9f5e-4870-8579-8560a63221a8","Type":"ContainerStarted","Data":"7f889825412bb5eac771f54aaac29a95d2d20dd05f0f4784672823d7b5b3ff02"} Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.267347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" event={"ID":"0e2461fa-57b4-406a-9801-522b2e3ee2f0","Type":"ContainerStarted","Data":"5c004e0b7e9a0c335104c7d8c99eb80f77c4a47051926f1e7337685e3e45f811"} Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.267610 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.272297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" event={"ID":"f2e499a5-b89a-45d4-bd3e-9f743e010a51","Type":"ContainerStarted","Data":"80e78dc32b8c6955d02e29be97b48d36d3385020396c70df3b805c9fe82b9a76"} Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.273544 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.295036 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" podStartSLOduration=3.450079482 podStartE2EDuration="51.295012579s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:03.819303662 +0000 UTC m=+1047.920742857" lastFinishedPulling="2025-12-01 21:51:51.664236759 +0000 UTC m=+1095.765675954" observedRunningTime="2025-12-01 
21:51:52.284103498 +0000 UTC m=+1096.385542693" watchObservedRunningTime="2025-12-01 21:51:52.295012579 +0000 UTC m=+1096.396451774" Dec 01 21:51:52 crc kubenswrapper[4962]: I1201 21:51:52.314788 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" podStartSLOduration=4.8982475260000005 podStartE2EDuration="50.314763721s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.130434488 +0000 UTC m=+1049.231873683" lastFinishedPulling="2025-12-01 21:51:50.546950683 +0000 UTC m=+1094.648389878" observedRunningTime="2025-12-01 21:51:52.308601335 +0000 UTC m=+1096.410040540" watchObservedRunningTime="2025-12-01 21:51:52.314763721 +0000 UTC m=+1096.416202926" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.280207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" event={"ID":"ed78cdfd-dc4e-4528-9542-6fc778f54e5f","Type":"ContainerStarted","Data":"5539ae77a7460b98fddfe8ce747d6080ebc5257b60735b8ce6a3460b60802189"} Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.280632 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.284129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" event={"ID":"1a9bd198-45fa-40ba-b3a0-55c150c211d6","Type":"ContainerStarted","Data":"254f6531f79a1df65ec55d3a811251c466c482211a139852648bfd010136161e"} Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.284370 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.296306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" event={"ID":"d0ae9966-90f0-4d97-a056-dd9e86c81949","Type":"ContainerStarted","Data":"b70c633352f8961be53261411646510bdb6feeee095060d4ac153aa05f1abfa5"} Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.297298 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.300174 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" event={"ID":"795b9a42-a6d4-487b-84ef-0f1b3617ebfc","Type":"ContainerStarted","Data":"5882d46df40b8627e64ac58751aa2b20bd9d765d63c6e34358c2c50bce6d6a3f"} Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.300425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.304536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" event={"ID":"fb72edda-e449-44f6-a85d-b74c0f3f9ad2","Type":"ContainerStarted","Data":"c7022d65765660a8c124e95df79c6932fad9c0acf9f53361e944a2e445625df3"} Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.304594 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.305510 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" podStartSLOduration=5.460406942 podStartE2EDuration="52.305490684s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.08655854 +0000 UTC m=+1049.187997735" lastFinishedPulling="2025-12-01 21:51:51.931642282 +0000 UTC m=+1096.033081477" observedRunningTime="2025-12-01 21:51:53.298659759 +0000 UTC m=+1097.400098994" watchObservedRunningTime="2025-12-01 21:51:53.305490684 +0000 UTC m=+1097.406929879" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.324814 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" podStartSLOduration=5.538249657 podStartE2EDuration="52.324792993s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.12415528 +0000 UTC m=+1049.225594465" lastFinishedPulling="2025-12-01 21:51:51.910698616 +0000 UTC m=+1096.012137801" observedRunningTime="2025-12-01 21:51:53.320691846 +0000 UTC m=+1097.422131061" watchObservedRunningTime="2025-12-01 21:51:53.324792993 +0000 UTC m=+1097.426232208" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.339250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" podStartSLOduration=5.517871916 podStartE2EDuration="52.339231433s" podCreationTimestamp="2025-12-01 21:51:01 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.110326376 +0000 UTC m=+1049.211765571" lastFinishedPulling="2025-12-01 21:51:51.931685893 +0000 UTC m=+1096.033125088" observedRunningTime="2025-12-01 21:51:53.336795174 +0000 UTC m=+1097.438234379" watchObservedRunningTime="2025-12-01 21:51:53.339231433 +0000 UTC m=+1097.440670638" Dec 01 21:51:53 crc kubenswrapper[4962]: I1201 21:51:53.362432 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" podStartSLOduration=4.394258581 podStartE2EDuration="51.362413373s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:04.961142483 +0000 UTC m=+1049.062581678" lastFinishedPulling="2025-12-01 21:51:51.929297275 +0000 UTC m=+1096.030736470" observedRunningTime="2025-12-01 21:51:53.354159488 +0000 UTC m=+1097.455598683" watchObservedRunningTime="2025-12-01 21:51:53.362413373 +0000 UTC m=+1097.463852578" Dec 01 21:51:58 crc kubenswrapper[4962]: I1201 21:51:58.266320 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-27r4m" Dec 01 21:51:58 crc kubenswrapper[4962]: I1201 21:51:58.300322 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" podStartSLOduration=9.419205511 podStartE2EDuration="56.300298968s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.050602517 +0000 UTC m=+1049.152041712" lastFinishedPulling="2025-12-01 21:51:51.931695974 +0000 UTC m=+1096.033135169" observedRunningTime="2025-12-01 21:51:53.374450335 +0000 UTC m=+1097.475889530" watchObservedRunningTime="2025-12-01 
21:51:58.300298968 +0000 UTC m=+1102.401738153" Dec 01 21:51:58 crc kubenswrapper[4962]: I1201 21:51:58.514296 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc" Dec 01 21:51:59 crc kubenswrapper[4962]: I1201 21:51:59.126540 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-d8646fccf-4h8tf" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.150957 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-t725d" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.461032 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-sjzl9" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.488773 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mllgh" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.613894 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-w68ng" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.620267 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-fkvsq" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.663429 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nptld" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.692696 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-7d98c" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.764463 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-xvbpf" Dec 01 21:52:02 crc kubenswrapper[4962]: I1201 21:52:02.909857 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4hgng" Dec 01 21:52:07 crc kubenswrapper[4962]: I1201 21:52:07.470733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" event={"ID":"af182ba4-78a6-41eb-bf65-8abd64207122","Type":"ContainerStarted","Data":"c6ec5950f16a3d0eb44865be70c1577c91df8e409198cd86fd34008f5f10641d"} Dec 01 21:52:07 crc kubenswrapper[4962]: I1201 21:52:07.472355 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" Dec 01 21:52:07 crc kubenswrapper[4962]: I1201 21:52:07.499895 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" podStartSLOduration=4.288749288 podStartE2EDuration="1m5.499871471s" podCreationTimestamp="2025-12-01 21:51:02 +0000 UTC" firstStartedPulling="2025-12-01 21:51:05.143568822 +0000 UTC m=+1049.245008017" lastFinishedPulling="2025-12-01 21:52:06.354690995 +0000 UTC m=+1110.456130200" observedRunningTime="2025-12-01 21:52:07.493358616 +0000 
UTC m=+1111.594797861" watchObservedRunningTime="2025-12-01 21:52:07.499871471 +0000 UTC m=+1111.601310696" Dec 01 21:52:13 crc kubenswrapper[4962]: I1201 21:52:13.294474 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6c484b4dc4-ch82f" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.088777 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-clh7c"] Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.090858 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.096829 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.096890 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.097056 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qx5q2" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.097091 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.098262 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-clh7c"] Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.140723 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-config\") pod \"dnsmasq-dns-675f4bcbfc-clh7c\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.141292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllnj\" (UniqueName: \"kubernetes.io/projected/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-kube-api-access-rllnj\") pod \"dnsmasq-dns-675f4bcbfc-clh7c\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.143106 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7nmb4"] Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.144676 4962 util.go:30] "No sandbox for pod can be found. 
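
The sequence just above (SyncLoop ADD, "No sandbox for pod can be found", informer caches populating, then VerifyControllerAttachedVolume per volume) is the normal cold-start path for a freshly scheduled pod. Judging only from the volume names in these entries, the dnsmasq-dns pod spec carries a ConfigMap-backed "config" volume plus the auto-injected projected service-account token. A hypothetical reconstruction with client-go types; the ConfigMap reference is an assumption, only the volume names come from the log:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // Hypothetical reconstruction of the volume stanza implied by the
    // dnsmasq-dns-675f4bcbfc-clh7c entries above.
    func main() {
        volumes := []corev1.Volume{
            {
                Name: "config", // kubernetes.io/configmap/acafcb5c-...-config
                VolumeSource: corev1.VolumeSource{
                    ConfigMap: &corev1.ConfigMapVolumeSource{
                        // Assumption: backed by the "dns" ConfigMap whose cache
                        // population is logged just before the volume entries.
                        LocalObjectReference: corev1.LocalObjectReference{Name: "dns"},
                    },
                },
            },
            {
                Name: "kube-api-access-rllnj", // projected token + kube-root-ca.crt + namespace
                VolumeSource: corev1.VolumeSource{
                    Projected: &corev1.ProjectedVolumeSource{},
                },
            },
        }
        for _, v := range volumes {
            fmt.Println(v.Name)
        }
    }
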
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.151508 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.152307 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7nmb4"]
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.243380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rllnj\" (UniqueName: \"kubernetes.io/projected/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-kube-api-access-rllnj\") pod \"dnsmasq-dns-675f4bcbfc-clh7c\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.243442 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.243514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-config\") pod \"dnsmasq-dns-675f4bcbfc-clh7c\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.243556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwmk\" (UniqueName: \"kubernetes.io/projected/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-kube-api-access-kwwmk\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.243586 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-config\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.244889 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-config\") pod \"dnsmasq-dns-675f4bcbfc-clh7c\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.266191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rllnj\" (UniqueName: \"kubernetes.io/projected/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-kube-api-access-rllnj\") pod \"dnsmasq-dns-675f4bcbfc-clh7c\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.344483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201
21:52:30.344859 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwmk\" (UniqueName: \"kubernetes.io/projected/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-kube-api-access-kwwmk\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.344976 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-config\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.345529 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.346569 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-config\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.364014 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwmk\" (UniqueName: \"kubernetes.io/projected/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-kube-api-access-kwwmk\") pod \"dnsmasq-dns-78dd6ddcc-7nmb4\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.427485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c"
Dec 01 21:52:30 crc kubenswrapper[4962]: I1201 21:52:30.463959 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4"
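
Every volume in these entries walks the same three phases: "operationExecutor.VerifyControllerAttachedVolume started" (reconciler_common.go:245), "operationExecutor.MountVolume started" (reconciler_common.go:218), then "MountVolume.SetUp succeeded" (operation_generator.go:637); ConfigMap, projected, and empty-dir volumes need no controller attach, so the first phase completes trivially. A small illustrative filter, assuming a kubelet journal on stdin, that tallies the three phases (the phase markers are the literal strings logged above; everything else is a sketch):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "strings"
    )

    // Count the volume-reconciler phases in a kubelet journal,
    // e.g.: journalctl -u kubelet | go run tally.go
    func main() {
        phases := []string{
            "operationExecutor.VerifyControllerAttachedVolume started", // attach verified
            "operationExecutor.MountVolume started",                    // mount begun
            "MountVolume.SetUp succeeded",                              // files in place
        }
        counts := make(map[string]int)
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
        for sc.Scan() {
            for _, p := range phases {
                if strings.Contains(sc.Text(), p) {
                    counts[p]++
                }
            }
        }
        for _, p := range phases {
            fmt.Printf("%-57s %d\n", p, counts[p])
        }
    }
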
Dec 01 21:52:31 crc kubenswrapper[4962]: I1201 21:52:31.169600 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7nmb4"]
Dec 01 21:52:31 crc kubenswrapper[4962]: I1201 21:52:31.178593 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-clh7c"]
Dec 01 21:52:31 crc kubenswrapper[4962]: I1201 21:52:31.783867 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4" event={"ID":"6951c087-297e-4743-a6ec-3a1e3a7a3f9f","Type":"ContainerStarted","Data":"a6266424258187176486dfcbb90ce44f96cc9dcba9ba4281b1869d5c24af5a7a"}
Dec 01 21:52:31 crc kubenswrapper[4962]: I1201 21:52:31.785041 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" event={"ID":"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1","Type":"ContainerStarted","Data":"26ca8ebb89da450c8f9229bf67035efba501f29e1af9d47f0f398ae32a27894d"}
Dec 01 21:52:32 crc kubenswrapper[4962]: I1201 21:52:32.918443 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-clh7c"]
Dec 01 21:52:32 crc kubenswrapper[4962]: I1201 21:52:32.964170 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-njtpt"]
Dec 01 21:52:32 crc kubenswrapper[4962]: I1201 21:52:32.967317 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-njtpt"
Dec 01 21:52:32 crc kubenswrapper[4962]: I1201 21:52:32.992405 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-njtpt"]
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.103803 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-config\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.103962 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-dns-svc\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.104034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvflk\" (UniqueName: \"kubernetes.io/projected/214eff82-ef7c-49b0-a9a3-5246584e9b66-kube-api-access-gvflk\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.204891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvflk\" (UniqueName: \"kubernetes.io/projected/214eff82-ef7c-49b0-a9a3-5246584e9b66-kube-api-access-gvflk\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.205038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-config\") pod
\"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.205103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-dns-svc\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.205889 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-config\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.205891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-dns-svc\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.230752 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7nmb4"] Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.232287 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvflk\" (UniqueName: \"kubernetes.io/projected/214eff82-ef7c-49b0-a9a3-5246584e9b66-kube-api-access-gvflk\") pod \"dnsmasq-dns-666b6646f7-njtpt\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.275800 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kv54m"] Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.279113 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.297462 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kv54m"] Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.324411 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.411154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.411347 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-config\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.411438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46qf\" (UniqueName: \"kubernetes.io/projected/33007873-cb3d-4f47-8883-a60f3f823a16-kube-api-access-g46qf\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.512479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-config\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.512784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46qf\" (UniqueName: \"kubernetes.io/projected/33007873-cb3d-4f47-8883-a60f3f823a16-kube-api-access-g46qf\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.512830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.513626 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-config\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.513644 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " pod="openstack/dnsmasq-dns-57d769cc4f-kv54m"
Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.560700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46qf\" (UniqueName: \"kubernetes.io/projected/33007873-cb3d-4f47-8883-a60f3f823a16-kube-api-access-g46qf\") pod \"dnsmasq-dns-57d769cc4f-kv54m\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") "
pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" Dec 01 21:52:33 crc kubenswrapper[4962]: I1201 21:52:33.610577 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.080452 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.082589 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.089086 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.089182 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.089338 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.089491 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.089788 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.089984 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jkck4" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.089146 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.090921 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.157212 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-njtpt"] Dec 01 21:52:35 crc kubenswrapper[4962]: W1201 21:52:34.164003 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod214eff82_ef7c_49b0_a9a3_5246584e9b66.slice/crio-b9ef5b9b904a69be7cb21f895f4db8cb7b5fa52634eadf9d37a5f616e0c66c0d WatchSource:0}: Error finding container b9ef5b9b904a69be7cb21f895f4db8cb7b5fa52634eadf9d37a5f616e0c66c0d: Status 404 returned error can't find the container with id b9ef5b9b904a69be7cb21f895f4db8cb7b5fa52634eadf9d37a5f616e0c66c0d Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.223639 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.223702 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.223890 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224126 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvkbl\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-kube-api-access-wvkbl\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224149 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224288 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.224377 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.252083 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kv54m"] Dec 01 21:52:35 crc kubenswrapper[4962]: W1201 21:52:34.300123 4962 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33007873_cb3d_4f47_8883_a60f3f823a16.slice/crio-e52e85602f6fe719f856f8cef376f2f3020aaf6e0f60d71855a7f6464d21fff8 WatchSource:0}: Error finding container e52e85602f6fe719f856f8cef376f2f3020aaf6e0f60d71855a7f6464d21fff8: Status 404 returned error can't find the container with id e52e85602f6fe719f856f8cef376f2f3020aaf6e0f60d71855a7f6464d21fff8 Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326272 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326329 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326374 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326397 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326428 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326465 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326490 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326513 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvkbl\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-kube-api-access-wvkbl\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326530 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326576 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.326615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.327199 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.327337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.327871 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.329010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.329458 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.332294 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.332905 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.349382 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.352711 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvkbl\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-kube-api-access-wvkbl\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.352834 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.366702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.376491 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") " pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.403077 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.404579 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.410880 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.413145 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.415349 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.415557 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.415880 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.416142 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.416143 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.416230 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hqps9"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.423023 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.549574 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550068 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550251 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550278 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550309 4962 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550356 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550381 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdxb\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-kube-api-access-jbdxb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550453 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550508 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.550609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.652029 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.652082 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.652133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.652268 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653053 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653091 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653148 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653272 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653740 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653828 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653877 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdxb\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-kube-api-access-jbdxb\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653905 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.653952 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.654669 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.657226 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.658217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.659383 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.660555 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.675730 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdxb\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-kube-api-access-jbdxb\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.676478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.696948 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.785169 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.838658 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" event={"ID":"214eff82-ef7c-49b0-a9a3-5246584e9b66","Type":"ContainerStarted","Data":"b9ef5b9b904a69be7cb21f895f4db8cb7b5fa52634eadf9d37a5f616e0c66c0d"} Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:34.839681 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" event={"ID":"33007873-cb3d-4f47-8883-a60f3f823a16","Type":"ContainerStarted","Data":"e52e85602f6fe719f856f8cef376f2f3020aaf6e0f60d71855a7f6464d21fff8"} Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.772005 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.776016 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.779695 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.779797 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.779738 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.780017 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gslgk" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.786792 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.805224 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888075 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888162 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " 
pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888547 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp9pw\" (UniqueName: \"kubernetes.io/projected/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-kube-api-access-qp9pw\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888616 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-kolla-config\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888650 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888679 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:35 crc kubenswrapper[4962]: I1201 21:52:35.888754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-config-data-default\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.029863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-kolla-config\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.029927 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.029978 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.030039 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-config-data-default\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.030063 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.030092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.030108 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.030135 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9pw\" (UniqueName: \"kubernetes.io/projected/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-kube-api-access-qp9pw\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.030367 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.030885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.031186 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-kolla-config\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.031738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-config-data-default\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.032091 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.040691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.060005 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.069592 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.072553 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp9pw\" (UniqueName: \"kubernetes.io/projected/c09bcbbf-f96b-4f90-8f2d-9d635454a05e-kube-api-access-qp9pw\") pod \"openstack-galera-0\" (UID: \"c09bcbbf-f96b-4f90-8f2d-9d635454a05e\") " pod="openstack/openstack-galera-0" Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.103056 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.113554 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:52:36 crc kubenswrapper[4962]: I1201 21:52:36.117484 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.204126 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.212129 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.220385 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-sk8wh" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.224591 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.224875 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.225228 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.230011 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.275698 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.280801 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.283634 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.283799 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.283918 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fchbl" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.293191 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.351056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.351144 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.351183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.351205 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6jrz\" (UniqueName: \"kubernetes.io/projected/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-kube-api-access-w6jrz\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.351227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.351259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.351301 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc 
kubenswrapper[4962]: I1201 21:52:37.351320 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.452911 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c35af6-81b8-418f-a1e9-e19209bab14d-kolla-config\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453030 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6jrz\" (UniqueName: \"kubernetes.io/projected/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-kube-api-access-w6jrz\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453107 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453139 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453160 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c35af6-81b8-418f-a1e9-e19209bab14d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86svv\" (UniqueName: \"kubernetes.io/projected/c1c35af6-81b8-418f-a1e9-e19209bab14d-kube-api-access-86svv\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453214 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453262 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1c35af6-81b8-418f-a1e9-e19209bab14d-config-data\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453289 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c35af6-81b8-418f-a1e9-e19209bab14d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.453639 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.454745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.455314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.455777 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.455891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.467602 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.478621 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.480379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6jrz\" (UniqueName: \"kubernetes.io/projected/aebd10ab-b3dd-4bc7-8ea0-f5883d794715-kube-api-access-w6jrz\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.509541 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"aebd10ab-b3dd-4bc7-8ea0-f5883d794715\") " pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.555065 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1c35af6-81b8-418f-a1e9-e19209bab14d-config-data\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.555118 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c35af6-81b8-418f-a1e9-e19209bab14d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.555161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c35af6-81b8-418f-a1e9-e19209bab14d-kolla-config\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.555246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c35af6-81b8-418f-a1e9-e19209bab14d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.555276 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86svv\" (UniqueName: \"kubernetes.io/projected/c1c35af6-81b8-418f-a1e9-e19209bab14d-kube-api-access-86svv\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.556209 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1c35af6-81b8-418f-a1e9-e19209bab14d-config-data\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.556783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c35af6-81b8-418f-a1e9-e19209bab14d-kolla-config\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.558294 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.561194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c35af6-81b8-418f-a1e9-e19209bab14d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.562626 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c35af6-81b8-418f-a1e9-e19209bab14d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.570699 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86svv\" (UniqueName: \"kubernetes.io/projected/c1c35af6-81b8-418f-a1e9-e19209bab14d-kube-api-access-86svv\") pod \"memcached-0\" (UID: \"c1c35af6-81b8-418f-a1e9-e19209bab14d\") " pod="openstack/memcached-0" Dec 01 21:52:37 crc kubenswrapper[4962]: I1201 21:52:37.613219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.328342 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.329762 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.332820 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2d8pf" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.357997 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.502057 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8hwx\" (UniqueName: \"kubernetes.io/projected/3dced14f-6bff-4820-b135-78ef69ba6b33-kube-api-access-x8hwx\") pod \"kube-state-metrics-0\" (UID: \"3dced14f-6bff-4820-b135-78ef69ba6b33\") " pod="openstack/kube-state-metrics-0" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.603643 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8hwx\" (UniqueName: \"kubernetes.io/projected/3dced14f-6bff-4820-b135-78ef69ba6b33-kube-api-access-x8hwx\") pod \"kube-state-metrics-0\" (UID: \"3dced14f-6bff-4820-b135-78ef69ba6b33\") " pod="openstack/kube-state-metrics-0" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.647842 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8hwx\" (UniqueName: \"kubernetes.io/projected/3dced14f-6bff-4820-b135-78ef69ba6b33-kube-api-access-x8hwx\") pod \"kube-state-metrics-0\" (UID: \"3dced14f-6bff-4820-b135-78ef69ba6b33\") " pod="openstack/kube-state-metrics-0" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.652363 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.799225 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc"] Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.800450 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.804512 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-sgwgm" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.804711 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.817780 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc"] Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.908541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2tvd\" (UniqueName: \"kubernetes.io/projected/07284111-fb8f-4fc6-9693-dfe6869248bf-kube-api-access-j2tvd\") pod \"observability-ui-dashboards-7d5fb4cbfb-9stzc\" (UID: \"07284111-fb8f-4fc6-9693-dfe6869248bf\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:39 crc kubenswrapper[4962]: I1201 21:52:39.908622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07284111-fb8f-4fc6-9693-dfe6869248bf-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-9stzc\" (UID: \"07284111-fb8f-4fc6-9693-dfe6869248bf\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.029069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2tvd\" (UniqueName: \"kubernetes.io/projected/07284111-fb8f-4fc6-9693-dfe6869248bf-kube-api-access-j2tvd\") pod \"observability-ui-dashboards-7d5fb4cbfb-9stzc\" (UID: \"07284111-fb8f-4fc6-9693-dfe6869248bf\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.029198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07284111-fb8f-4fc6-9693-dfe6869248bf-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-9stzc\" (UID: \"07284111-fb8f-4fc6-9693-dfe6869248bf\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:40 crc kubenswrapper[4962]: E1201 21:52:40.029539 4962 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Dec 01 21:52:40 crc kubenswrapper[4962]: E1201 21:52:40.029629 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07284111-fb8f-4fc6-9693-dfe6869248bf-serving-cert podName:07284111-fb8f-4fc6-9693-dfe6869248bf nodeName:}" failed. No retries permitted until 2025-12-01 21:52:40.529606703 +0000 UTC m=+1144.631045898 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07284111-fb8f-4fc6-9693-dfe6869248bf-serving-cert") pod "observability-ui-dashboards-7d5fb4cbfb-9stzc" (UID: "07284111-fb8f-4fc6-9693-dfe6869248bf") : secret "observability-ui-dashboards" not found Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.055456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2tvd\" (UniqueName: \"kubernetes.io/projected/07284111-fb8f-4fc6-9693-dfe6869248bf-kube-api-access-j2tvd\") pod \"observability-ui-dashboards-7d5fb4cbfb-9stzc\" (UID: \"07284111-fb8f-4fc6-9693-dfe6869248bf\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.181097 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c4bf499-lfzj5"] Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.182202 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.204190 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4bf499-lfzj5"] Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.252097 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-trusted-ca-bundle\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.252163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-service-ca\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.252191 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwm4s\" (UniqueName: \"kubernetes.io/projected/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-kube-api-access-dwm4s\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.252222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-oauth-config\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.252361 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-oauth-serving-cert\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.252382 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-serving-cert\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.252469 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-config\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.354057 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-oauth-serving-cert\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.354099 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-serving-cert\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.354116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-config\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.354140 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-trusted-ca-bundle\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.354174 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-service-ca\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.354198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwm4s\" (UniqueName: \"kubernetes.io/projected/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-kube-api-access-dwm4s\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.354222 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-oauth-config\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.355153 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-oauth-serving-cert\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.355694 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-config\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.356299 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-service-ca\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.358418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-trusted-ca-bundle\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.360620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-oauth-config\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.360910 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-console-serving-cert\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.377750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwm4s\" (UniqueName: \"kubernetes.io/projected/8aeb2e7e-7195-43a5-b5a8-760f44df8a86-kube-api-access-dwm4s\") pod \"console-6c4bf499-lfzj5\" (UID: \"8aeb2e7e-7195-43a5-b5a8-760f44df8a86\") " pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.423571 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.433791 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.437661 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.437793 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.437971 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.440544 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-65vdm" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.440659 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.447638 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.449377 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.501480 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.557638 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7xqn\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-kube-api-access-v7xqn\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.557760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4f8782cd-368d-4071-848d-8ad2379ddf6c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.557859 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07284111-fb8f-4fc6-9693-dfe6869248bf-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-9stzc\" (UID: \"07284111-fb8f-4fc6-9693-dfe6869248bf\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.558631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f8782cd-368d-4071-848d-8ad2379ddf6c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.558728 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: 
\"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.559076 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.559111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.559250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-config\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.559278 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.562545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07284111-fb8f-4fc6-9693-dfe6869248bf-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-9stzc\" (UID: \"07284111-fb8f-4fc6-9693-dfe6869248bf\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.660963 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-config\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.661012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.661065 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7xqn\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-kube-api-access-v7xqn\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.661109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/4f8782cd-368d-4071-848d-8ad2379ddf6c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.661156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f8782cd-368d-4071-848d-8ad2379ddf6c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.661193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.661225 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.661241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.662219 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.662418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4f8782cd-368d-4071-848d-8ad2379ddf6c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.665584 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.665691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.667293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f8782cd-368d-4071-848d-8ad2379ddf6c-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.667480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-config\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.680548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7xqn\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-kube-api-access-v7xqn\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.681899 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.690601 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: W1201 21:52:40.703327 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9ef8bb6_0fc4_411e_82a1_85d95ced5818.slice/crio-78e5aa08bb5fc48c8005a571ce973bcf7fe5ed46a0b3db0a3f5790ad1c304c45 WatchSource:0}: Error finding container 78e5aa08bb5fc48c8005a571ce973bcf7fe5ed46a0b3db0a3f5790ad1c304c45: Status 404 returned error can't find the container with id 78e5aa08bb5fc48c8005a571ce973bcf7fe5ed46a0b3db0a3f5790ad1c304c45 Dec 01 21:52:40 crc kubenswrapper[4962]: W1201 21:52:40.705592 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e9a059a_712b_4ff4_b50e_7d94a96a9db5.slice/crio-3f2d9cbde4485b49740dce5ef9a9aeebcd4abfce6e5fe5955bf2d2e23669f7e7 WatchSource:0}: Error finding container 3f2d9cbde4485b49740dce5ef9a9aeebcd4abfce6e5fe5955bf2d2e23669f7e7: Status 404 returned error can't find the container with id 3f2d9cbde4485b49740dce5ef9a9aeebcd4abfce6e5fe5955bf2d2e23669f7e7 Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.726215 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.759481 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.920035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9ef8bb6-0fc4-411e-82a1-85d95ced5818","Type":"ContainerStarted","Data":"78e5aa08bb5fc48c8005a571ce973bcf7fe5ed46a0b3db0a3f5790ad1c304c45"} Dec 01 21:52:40 crc kubenswrapper[4962]: I1201 21:52:40.921271 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e9a059a-712b-4ff4-b50e-7d94a96a9db5","Type":"ContainerStarted","Data":"3f2d9cbde4485b49740dce5ef9a9aeebcd4abfce6e5fe5955bf2d2e23669f7e7"} Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.165263 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xd7ph"] Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.166758 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.168866 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6rwls" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.169041 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.169355 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.181747 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xd7ph"] Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.219384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-run-ovn\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.219436 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-ovn-controller-tls-certs\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.219502 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-scripts\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.219591 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-combined-ca-bundle\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.219622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-run\") pod 
\"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.219637 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-log-ovn\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.219662 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z9k5\" (UniqueName: \"kubernetes.io/projected/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-kube-api-access-8z9k5\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.251757 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cdpb9"] Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.253755 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.260893 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cdpb9"] Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.321545 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-lib\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.321617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-combined-ca-bundle\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.321658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-run\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.321739 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-log-ovn\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.321806 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88fa575e-baee-41dd-8c7e-72baff22783e-scripts\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.321841 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z9k5\" (UniqueName: \"kubernetes.io/projected/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-kube-api-access-8z9k5\") pod 
\"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.321895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-etc-ovs\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322073 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-run-ovn\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322104 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-ovn-controller-tls-certs\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322122 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-run\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322149 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxpxc\" (UniqueName: \"kubernetes.io/projected/88fa575e-baee-41dd-8c7e-72baff22783e-kube-api-access-sxpxc\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-log-ovn\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322278 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-run\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322318 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-scripts\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.322355 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-log\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: 
I1201 21:52:42.322386 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-var-run-ovn\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.326209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-scripts\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.334198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-ovn-controller-tls-certs\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.337771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z9k5\" (UniqueName: \"kubernetes.io/projected/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-kube-api-access-8z9k5\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.339051 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d-combined-ca-bundle\") pod \"ovn-controller-xd7ph\" (UID: \"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d\") " pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.423852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxpxc\" (UniqueName: \"kubernetes.io/projected/88fa575e-baee-41dd-8c7e-72baff22783e-kube-api-access-sxpxc\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.423980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-run\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424042 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-log\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424083 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-lib\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424144 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88fa575e-baee-41dd-8c7e-72baff22783e-scripts\") pod 
\"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-etc-ovs\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424231 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-run\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-lib\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-var-log\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.424602 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88fa575e-baee-41dd-8c7e-72baff22783e-etc-ovs\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.426480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88fa575e-baee-41dd-8c7e-72baff22783e-scripts\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.440686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxpxc\" (UniqueName: \"kubernetes.io/projected/88fa575e-baee-41dd-8c7e-72baff22783e-kube-api-access-sxpxc\") pod \"ovn-controller-ovs-cdpb9\" (UID: \"88fa575e-baee-41dd-8c7e-72baff22783e\") " pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.553415 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph" Dec 01 21:52:42 crc kubenswrapper[4962]: I1201 21:52:42.571713 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.893748 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.898478 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.901778 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.902872 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.903299 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qwzhh" Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.903512 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.903752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 21:52:43 crc kubenswrapper[4962]: I1201 21:52:43.904555 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.102658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.102919 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.103078 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f893a462-9c1f-4b76-84fc-ba5e84364399-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.103190 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.103290 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.103384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f893a462-9c1f-4b76-84fc-ba5e84364399-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.103523 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f893a462-9c1f-4b76-84fc-ba5e84364399-config\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.103658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlddq\" (UniqueName: \"kubernetes.io/projected/f893a462-9c1f-4b76-84fc-ba5e84364399-kube-api-access-tlddq\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.205863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.205984 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.206057 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f893a462-9c1f-4b76-84fc-ba5e84364399-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.206091 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.206146 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.206199 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f893a462-9c1f-4b76-84fc-ba5e84364399-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.206279 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893a462-9c1f-4b76-84fc-ba5e84364399-config\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.206328 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 
21:52:44.206378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlddq\" (UniqueName: \"kubernetes.io/projected/f893a462-9c1f-4b76-84fc-ba5e84364399-kube-api-access-tlddq\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.207687 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f893a462-9c1f-4b76-84fc-ba5e84364399-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.208671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f893a462-9c1f-4b76-84fc-ba5e84364399-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.210851 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893a462-9c1f-4b76-84fc-ba5e84364399-config\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.214250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.215347 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.215914 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f893a462-9c1f-4b76-84fc-ba5e84364399-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.225913 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlddq\" (UniqueName: \"kubernetes.io/projected/f893a462-9c1f-4b76-84fc-ba5e84364399-kube-api-access-tlddq\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.237472 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f893a462-9c1f-4b76-84fc-ba5e84364399\") " pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:44 crc kubenswrapper[4962]: I1201 21:52:44.320748 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.153880 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.155964 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.157963 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.160111 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.160222 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2hm7x" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.160443 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.166205 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.278856 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.279212 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.279242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe00e319-7859-4bac-9316-156263865d80-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.279270 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.279301 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe00e319-7859-4bac-9316-156263865d80-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.279432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwhqb\" (UniqueName: \"kubernetes.io/projected/fe00e319-7859-4bac-9316-156263865d80-kube-api-access-jwhqb\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc 
kubenswrapper[4962]: I1201 21:52:47.279470 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe00e319-7859-4bac-9316-156263865d80-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.279531 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381477 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqb\" (UniqueName: \"kubernetes.io/projected/fe00e319-7859-4bac-9316-156263865d80-kube-api-access-jwhqb\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe00e319-7859-4bac-9316-156263865d80-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381793 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe00e319-7859-4bac-9316-156263865d80-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381821 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.381864 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe00e319-7859-4bac-9316-156263865d80-config\") 
pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.382865 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe00e319-7859-4bac-9316-156263865d80-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.383438 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.384202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe00e319-7859-4bac-9316-156263865d80-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.384261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe00e319-7859-4bac-9316-156263865d80-config\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.388901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.389532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.393040 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe00e319-7859-4bac-9316-156263865d80-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.402860 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwhqb\" (UniqueName: \"kubernetes.io/projected/fe00e319-7859-4bac-9316-156263865d80-kube-api-access-jwhqb\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.419477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fe00e319-7859-4bac-9316-156263865d80\") " pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:47 crc kubenswrapper[4962]: I1201 21:52:47.510611 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 21:52:49 crc kubenswrapper[4962]: I1201 21:52:49.610659 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.317262 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.317991 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g46qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-kv54m_openstack(33007873-cb3d-4f47-8883-a60f3f823a16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:52:51 crc kubenswrapper[4962]: W1201 21:52:51.318321 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dced14f_6bff_4820_b135_78ef69ba6b33.slice/crio-cd63ae4baebc3800f00b82ad190c515c0cfdb69ed4241445e9af6edf6a8576ac WatchSource:0}: Error finding container cd63ae4baebc3800f00b82ad190c515c0cfdb69ed4241445e9af6edf6a8576ac: Status 404 returned error can't find the container with id cd63ae4baebc3800f00b82ad190c515c0cfdb69ed4241445e9af6edf6a8576ac Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.319603 4962 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.351131 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.351736 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwwmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-7nmb4_openstack(6951c087-297e-4743-a6ec-3a1e3a7a3f9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.353008 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4" podUID="6951c087-297e-4743-a6ec-3a1e3a7a3f9f" Dec 01 21:52:51 crc kubenswrapper[4962]: I1201 21:52:51.354758 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.396271 4962 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.396469 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rllnj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-clh7c_openstack(acafcb5c-9dfc-4997-bc42-84fbcb84c6d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:52:51 crc kubenswrapper[4962]: E1201 21:52:51.397667 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" podUID="acafcb5c-9dfc-4997-bc42-84fbcb84c6d1" Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.101809 4962 generic.go:334] "Generic (PLEG): container finished" podID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerID="13306bf603397e4024db8bc4f86d54c728e0721e68e03f159229afeedca2751e" exitCode=0 Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.105614 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" event={"ID":"214eff82-ef7c-49b0-a9a3-5246584e9b66","Type":"ContainerDied","Data":"13306bf603397e4024db8bc4f86d54c728e0721e68e03f159229afeedca2751e"} Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.117040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"3dced14f-6bff-4820-b135-78ef69ba6b33","Type":"ContainerStarted","Data":"cd63ae4baebc3800f00b82ad190c515c0cfdb69ed4241445e9af6edf6a8576ac"} Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.352270 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4bf499-lfzj5"] Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.363610 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.379372 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 21:52:52 crc kubenswrapper[4962]: W1201 21:52:52.741227 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aeb2e7e_7195_43a5_b5a8_760f44df8a86.slice/crio-177cae4eeb0eb9c85ffec6e41c49db394e7fd8c7b990d0ad0dd4098c159f5c6c WatchSource:0}: Error finding container 177cae4eeb0eb9c85ffec6e41c49db394e7fd8c7b990d0ad0dd4098c159f5c6c: Status 404 returned error can't find the container with id 177cae4eeb0eb9c85ffec6e41c49db394e7fd8c7b990d0ad0dd4098c159f5c6c Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.838453 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc"] Dec 01 21:52:52 crc kubenswrapper[4962]: I1201 21:52:52.857130 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.112447 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xd7ph"] Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.144415 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4bf499-lfzj5" event={"ID":"8aeb2e7e-7195-43a5-b5a8-760f44df8a86","Type":"ContainerStarted","Data":"177cae4eeb0eb9c85ffec6e41c49db394e7fd8c7b990d0ad0dd4098c159f5c6c"} Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.150520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9ef8bb6-0fc4-411e-82a1-85d95ced5818","Type":"ContainerStarted","Data":"8adf4ba0aa720627144c4b6055ae0379a2e8bc72a0049b5aea7634192f4d4038"} Dec 01 21:52:53 crc kubenswrapper[4962]: W1201 21:52:53.151422 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f0c9ce_824a_4a5b_adc9_f7a09b3ab97d.slice/crio-69330227d302e2a7266894e2af8f38e2a224ac8b258b275423c8fe5573348107 WatchSource:0}: Error finding container 69330227d302e2a7266894e2af8f38e2a224ac8b258b275423c8fe5573348107: Status 404 returned error can't find the container with id 69330227d302e2a7266894e2af8f38e2a224ac8b258b275423c8fe5573348107 Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.153477 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerStarted","Data":"a68f2fb2321cc67b862c7135ec6db51274b6482a0f892e5ba2f3b3c06b66c488"} Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.155583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e9a059a-712b-4ff4-b50e-7d94a96a9db5","Type":"ContainerStarted","Data":"2e612e8c7d52bd7bb195592643b0167d6f4ce348b0ef115b6d213703e68c13cb"} Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.157697 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"aebd10ab-b3dd-4bc7-8ea0-f5883d794715","Type":"ContainerStarted","Data":"08ac08c58f8b5b86f30166920e4a98a68a6d1f0ef634205592b60b9889d76c06"} Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.159187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c1c35af6-81b8-418f-a1e9-e19209bab14d","Type":"ContainerStarted","Data":"f456e5d91ecd6463d3b02116ed261a676673380b7a46d610db26c64563488770"} Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.183347 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.227722 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.336154 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.356618 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.450640 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwwmk\" (UniqueName: \"kubernetes.io/projected/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-kube-api-access-kwwmk\") pod \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.450721 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-config\") pod \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.450863 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-dns-svc\") pod \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\" (UID: \"6951c087-297e-4743-a6ec-3a1e3a7a3f9f\") " Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.451667 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6951c087-297e-4743-a6ec-3a1e3a7a3f9f" (UID: "6951c087-297e-4743-a6ec-3a1e3a7a3f9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.451959 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-config" (OuterVolumeSpecName: "config") pod "6951c087-297e-4743-a6ec-3a1e3a7a3f9f" (UID: "6951c087-297e-4743-a6ec-3a1e3a7a3f9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.457668 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-kube-api-access-kwwmk" (OuterVolumeSpecName: "kube-api-access-kwwmk") pod "6951c087-297e-4743-a6ec-3a1e3a7a3f9f" (UID: "6951c087-297e-4743-a6ec-3a1e3a7a3f9f"). InnerVolumeSpecName "kube-api-access-kwwmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.552379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rllnj\" (UniqueName: \"kubernetes.io/projected/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-kube-api-access-rllnj\") pod \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.552494 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-config\") pod \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\" (UID: \"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1\") " Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.553042 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.553057 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwwmk\" (UniqueName: \"kubernetes.io/projected/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-kube-api-access-kwwmk\") on node \"crc\" DevicePath \"\"" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.553070 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951c087-297e-4743-a6ec-3a1e3a7a3f9f-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.553518 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-config" (OuterVolumeSpecName: "config") pod "acafcb5c-9dfc-4997-bc42-84fbcb84c6d1" (UID: "acafcb5c-9dfc-4997-bc42-84fbcb84c6d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.555963 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-kube-api-access-rllnj" (OuterVolumeSpecName: "kube-api-access-rllnj") pod "acafcb5c-9dfc-4997-bc42-84fbcb84c6d1" (UID: "acafcb5c-9dfc-4997-bc42-84fbcb84c6d1"). InnerVolumeSpecName "kube-api-access-rllnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.654664 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.654693 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rllnj\" (UniqueName: \"kubernetes.io/projected/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1-kube-api-access-rllnj\") on node \"crc\" DevicePath \"\"" Dec 01 21:52:53 crc kubenswrapper[4962]: I1201 21:52:53.849081 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 21:52:54 crc kubenswrapper[4962]: W1201 21:52:54.075295 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf893a462_9c1f_4b76_84fc_ba5e84364399.slice/crio-3fb5ece938a42acd138cf795f56dfe77b71669d86e6f84d1449cf2f031ae489e WatchSource:0}: Error finding container 3fb5ece938a42acd138cf795f56dfe77b71669d86e6f84d1449cf2f031ae489e: Status 404 returned error can't find the container with id 3fb5ece938a42acd138cf795f56dfe77b71669d86e6f84d1449cf2f031ae489e Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.186759 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c09bcbbf-f96b-4f90-8f2d-9d635454a05e","Type":"ContainerStarted","Data":"78ecfd4d261079cda051306a9e98c163aeedddb54ac11d931c3929b8cb1d11cf"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.189237 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" event={"ID":"07284111-fb8f-4fc6-9693-dfe6869248bf","Type":"ContainerStarted","Data":"5f1b59bf657c673e738df677682aa80526d933223d1d062bd9cad900208c6421"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.190009 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cdpb9"] Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.190595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f893a462-9c1f-4b76-84fc-ba5e84364399","Type":"ContainerStarted","Data":"3fb5ece938a42acd138cf795f56dfe77b71669d86e6f84d1449cf2f031ae489e"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.193966 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.194015 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-clh7c" event={"ID":"acafcb5c-9dfc-4997-bc42-84fbcb84c6d1","Type":"ContainerDied","Data":"26ca8ebb89da450c8f9229bf67035efba501f29e1af9d47f0f398ae32a27894d"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.195781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe00e319-7859-4bac-9316-156263865d80","Type":"ContainerStarted","Data":"312c70d6bc677bc82e85b49fb7e46aa78f3f3727c1976e95cf64c8f1e68649c4"} Dec 01 21:52:54 crc kubenswrapper[4962]: W1201 21:52:54.196125 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88fa575e_baee_41dd_8c7e_72baff22783e.slice/crio-313a7228c0ed1b4e14e386eb096f4bf95fc7dd3a201d8939c1f701b1ef35d131 WatchSource:0}: Error finding container 313a7228c0ed1b4e14e386eb096f4bf95fc7dd3a201d8939c1f701b1ef35d131: Status 404 returned error can't find the container with id 313a7228c0ed1b4e14e386eb096f4bf95fc7dd3a201d8939c1f701b1ef35d131 Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.198922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" event={"ID":"214eff82-ef7c-49b0-a9a3-5246584e9b66","Type":"ContainerStarted","Data":"bb7e400348f164596c2b4bf570e25417f82a7d8338cf7b8c5d94f13403552be4"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.199030 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.202221 4962 generic.go:334] "Generic (PLEG): container finished" podID="33007873-cb3d-4f47-8883-a60f3f823a16" containerID="f4f41170d9e311be4cf208f6f061d3c7127e6a261d74c26dee78e6b911148fa1" exitCode=0 Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.202255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" event={"ID":"33007873-cb3d-4f47-8883-a60f3f823a16","Type":"ContainerDied","Data":"f4f41170d9e311be4cf208f6f061d3c7127e6a261d74c26dee78e6b911148fa1"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.203612 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4" Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.203623 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7nmb4" event={"ID":"6951c087-297e-4743-a6ec-3a1e3a7a3f9f","Type":"ContainerDied","Data":"a6266424258187176486dfcbb90ce44f96cc9dcba9ba4281b1869d5c24af5a7a"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.206121 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph" event={"ID":"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d","Type":"ContainerStarted","Data":"69330227d302e2a7266894e2af8f38e2a224ac8b258b275423c8fe5573348107"} Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.222763 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" podStartSLOduration=4.883057127 podStartE2EDuration="22.222733276s" podCreationTimestamp="2025-12-01 21:52:32 +0000 UTC" firstStartedPulling="2025-12-01 21:52:34.167060617 +0000 UTC m=+1138.268499822" lastFinishedPulling="2025-12-01 21:52:51.506736776 +0000 UTC m=+1155.608175971" observedRunningTime="2025-12-01 21:52:54.213482703 +0000 UTC m=+1158.314921898" watchObservedRunningTime="2025-12-01 21:52:54.222733276 +0000 UTC m=+1158.324172471" Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.289147 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7nmb4"] Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.301974 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7nmb4"] Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.341026 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-clh7c"] Dec 01 21:52:54 crc kubenswrapper[4962]: I1201 21:52:54.354486 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-clh7c"] Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.219385 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3dced14f-6bff-4820-b135-78ef69ba6b33","Type":"ContainerStarted","Data":"5f35cf4077e4ba9ef3b58db0be30ca171cd84839dc0460825c1ad95e246c604e"} Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.219831 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.227137 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4bf499-lfzj5" event={"ID":"8aeb2e7e-7195-43a5-b5a8-760f44df8a86","Type":"ContainerStarted","Data":"967d9d27d2edde06950283754e43f7084cac93167a8753c4c662c253fb7cfcba"} Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.232453 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" event={"ID":"33007873-cb3d-4f47-8883-a60f3f823a16","Type":"ContainerStarted","Data":"ef39056f5dc29b17fbcdcec508f51b55561a93e38b7c4f611a448d91a3c73791"} Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.232791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.245002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdpb9" 
event={"ID":"88fa575e-baee-41dd-8c7e-72baff22783e","Type":"ContainerStarted","Data":"313a7228c0ed1b4e14e386eb096f4bf95fc7dd3a201d8939c1f701b1ef35d131"} Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.248036 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.463221114 podStartE2EDuration="16.248019162s" podCreationTimestamp="2025-12-01 21:52:39 +0000 UTC" firstStartedPulling="2025-12-01 21:52:51.354462074 +0000 UTC m=+1155.455901269" lastFinishedPulling="2025-12-01 21:52:54.139260122 +0000 UTC m=+1158.240699317" observedRunningTime="2025-12-01 21:52:55.238575284 +0000 UTC m=+1159.340014479" watchObservedRunningTime="2025-12-01 21:52:55.248019162 +0000 UTC m=+1159.349458357" Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.288829 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c4bf499-lfzj5" podStartSLOduration=15.288808593 podStartE2EDuration="15.288808593s" podCreationTimestamp="2025-12-01 21:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:52:55.284875841 +0000 UTC m=+1159.386315036" watchObservedRunningTime="2025-12-01 21:52:55.288808593 +0000 UTC m=+1159.390247788" Dec 01 21:52:55 crc kubenswrapper[4962]: I1201 21:52:55.294335 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" podStartSLOduration=-9223372014.560457 podStartE2EDuration="22.294317789s" podCreationTimestamp="2025-12-01 21:52:33 +0000 UTC" firstStartedPulling="2025-12-01 21:52:34.30361614 +0000 UTC m=+1138.405055335" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:52:55.2693712 +0000 UTC m=+1159.370810395" watchObservedRunningTime="2025-12-01 21:52:55.294317789 +0000 UTC m=+1159.395756984" Dec 01 21:52:56 crc kubenswrapper[4962]: I1201 21:52:56.236000 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6951c087-297e-4743-a6ec-3a1e3a7a3f9f" path="/var/lib/kubelet/pods/6951c087-297e-4743-a6ec-3a1e3a7a3f9f/volumes" Dec 01 21:52:56 crc kubenswrapper[4962]: I1201 21:52:56.236544 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acafcb5c-9dfc-4997-bc42-84fbcb84c6d1" path="/var/lib/kubelet/pods/acafcb5c-9dfc-4997-bc42-84fbcb84c6d1/volumes" Dec 01 21:52:58 crc kubenswrapper[4962]: I1201 21:52:58.326920 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:52:59 crc kubenswrapper[4962]: I1201 21:52:59.673111 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 21:53:00 crc kubenswrapper[4962]: I1201 21:53:00.503542 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:53:00 crc kubenswrapper[4962]: I1201 21:53:00.503869 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:53:00 crc kubenswrapper[4962]: I1201 21:53:00.526914 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:53:01 crc kubenswrapper[4962]: I1201 21:53:01.326469 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c4bf499-lfzj5" Dec 01 21:53:01 
crc kubenswrapper[4962]: I1201 21:53:01.396314 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578c49649f-mltwz"] Dec 01 21:53:03 crc kubenswrapper[4962]: I1201 21:53:03.612524 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" Dec 01 21:53:03 crc kubenswrapper[4962]: I1201 21:53:03.672916 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-njtpt"] Dec 01 21:53:03 crc kubenswrapper[4962]: I1201 21:53:03.673153 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerName="dnsmasq-dns" containerID="cri-o://bb7e400348f164596c2b4bf570e25417f82a7d8338cf7b8c5d94f13403552be4" gracePeriod=10 Dec 01 21:53:05 crc kubenswrapper[4962]: I1201 21:53:05.364838 4962 generic.go:334] "Generic (PLEG): container finished" podID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerID="bb7e400348f164596c2b4bf570e25417f82a7d8338cf7b8c5d94f13403552be4" exitCode=0 Dec 01 21:53:05 crc kubenswrapper[4962]: I1201 21:53:05.364903 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" event={"ID":"214eff82-ef7c-49b0-a9a3-5246584e9b66","Type":"ContainerDied","Data":"bb7e400348f164596c2b4bf570e25417f82a7d8338cf7b8c5d94f13403552be4"} Dec 01 21:53:08 crc kubenswrapper[4962]: I1201 21:53:08.325868 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Dec 01 21:53:08 crc kubenswrapper[4962]: E1201 21:53:08.916910 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Dec 01 21:53:08 crc kubenswrapper[4962]: E1201 21:53:08.917372 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c6h5d9h585h54fhcch5fch649h568hd8h56bh57dh554h67bhbh597h5f7h64chfh9dh546h5cbh645h586h67hbh5cdh69hfch57bh669h548h696q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwhqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-sb-0_openstack(fe00e319-7859-4bac-9316-156263865d80): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:53:09 crc kubenswrapper[4962]: E1201 21:53:09.575065 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 01 21:53:09 crc kubenswrapper[4962]: E1201 21:53:09.575314 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67ch57h5f7h58ch5fchc9hdfh667hfdhd8h667h56ch54h575h5bhbh548h559hc6hcdh658h594h594h5bh65dhfbh558h55dh5dch88h67hffq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8z9k5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl 
stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-xd7ph_openstack(f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:53:09 crc kubenswrapper[4962]: E1201 21:53:09.577018 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-xd7ph" podUID="f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d" Dec 01 21:53:09 crc kubenswrapper[4962]: I1201 21:53:09.995086 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.034599 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-config\") pod \"214eff82-ef7c-49b0-a9a3-5246584e9b66\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.034702 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvflk\" (UniqueName: \"kubernetes.io/projected/214eff82-ef7c-49b0-a9a3-5246584e9b66-kube-api-access-gvflk\") pod \"214eff82-ef7c-49b0-a9a3-5246584e9b66\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.035010 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-dns-svc\") pod \"214eff82-ef7c-49b0-a9a3-5246584e9b66\" (UID: \"214eff82-ef7c-49b0-a9a3-5246584e9b66\") " Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.115301 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214eff82-ef7c-49b0-a9a3-5246584e9b66-kube-api-access-gvflk" (OuterVolumeSpecName: "kube-api-access-gvflk") pod "214eff82-ef7c-49b0-a9a3-5246584e9b66" (UID: "214eff82-ef7c-49b0-a9a3-5246584e9b66"). InnerVolumeSpecName "kube-api-access-gvflk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.149712 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvflk\" (UniqueName: \"kubernetes.io/projected/214eff82-ef7c-49b0-a9a3-5246584e9b66-kube-api-access-gvflk\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.432989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" event={"ID":"214eff82-ef7c-49b0-a9a3-5246584e9b66","Type":"ContainerDied","Data":"b9ef5b9b904a69be7cb21f895f4db8cb7b5fa52634eadf9d37a5f616e0c66c0d"} Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.433345 4962 scope.go:117] "RemoveContainer" containerID="bb7e400348f164596c2b4bf570e25417f82a7d8338cf7b8c5d94f13403552be4" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.433482 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-njtpt" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.436585 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"aebd10ab-b3dd-4bc7-8ea0-f5883d794715","Type":"ContainerStarted","Data":"b816197ad0c41e001d921ad988531031707a87e15443c6459e4f7331898f5fbd"} Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.449421 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c1c35af6-81b8-418f-a1e9-e19209bab14d","Type":"ContainerStarted","Data":"a7d526896d0977ba86ec94ea2ae8887bf027d10275708effc31e2f85ca5b020e"} Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.449524 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 01 21:53:10 crc kubenswrapper[4962]: E1201 21:53:10.453624 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-xd7ph" podUID="f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.495683 4962 scope.go:117] "RemoveContainer" containerID="13306bf603397e4024db8bc4f86d54c728e0721e68e03f159229afeedca2751e" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.504377 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.099707229 podStartE2EDuration="33.504355388s" podCreationTimestamp="2025-12-01 21:52:37 +0000 UTC" firstStartedPulling="2025-12-01 21:52:52.742752346 +0000 UTC m=+1156.844191541" lastFinishedPulling="2025-12-01 21:53:08.147400495 +0000 UTC m=+1172.248839700" observedRunningTime="2025-12-01 21:53:10.485379287 +0000 UTC m=+1174.586818492" watchObservedRunningTime="2025-12-01 21:53:10.504355388 +0000 UTC m=+1174.605794583" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.610508 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-config" (OuterVolumeSpecName: "config") pod "214eff82-ef7c-49b0-a9a3-5246584e9b66" (UID: "214eff82-ef7c-49b0-a9a3-5246584e9b66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.625670 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "214eff82-ef7c-49b0-a9a3-5246584e9b66" (UID: "214eff82-ef7c-49b0-a9a3-5246584e9b66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.668888 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.668916 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214eff82-ef7c-49b0-a9a3-5246584e9b66-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.781662 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-njtpt"] Dec 01 21:53:10 crc kubenswrapper[4962]: I1201 21:53:10.794913 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-njtpt"] Dec 01 21:53:11 crc kubenswrapper[4962]: I1201 21:53:11.461733 4962 generic.go:334] "Generic (PLEG): container finished" podID="88fa575e-baee-41dd-8c7e-72baff22783e" containerID="ca4b2607c6686e44d39b76860e3d28347f4c7b41110b3b751a3a3e256d2ed10b" exitCode=0 Dec 01 21:53:11 crc kubenswrapper[4962]: I1201 21:53:11.461833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdpb9" event={"ID":"88fa575e-baee-41dd-8c7e-72baff22783e","Type":"ContainerDied","Data":"ca4b2607c6686e44d39b76860e3d28347f4c7b41110b3b751a3a3e256d2ed10b"} Dec 01 21:53:11 crc kubenswrapper[4962]: I1201 21:53:11.465157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f893a462-9c1f-4b76-84fc-ba5e84364399","Type":"ContainerStarted","Data":"ef0f49c302eaf2a5af17013eba889a33ba5d3d499751c01e2bb190ee997b2b67"} Dec 01 21:53:11 crc kubenswrapper[4962]: I1201 21:53:11.468399 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c09bcbbf-f96b-4f90-8f2d-9d635454a05e","Type":"ContainerStarted","Data":"a7396560ecdd00d6c4979b49374390be3a370d23de7c509617eca962007d5c5b"} Dec 01 21:53:11 crc kubenswrapper[4962]: I1201 21:53:11.485821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" event={"ID":"07284111-fb8f-4fc6-9693-dfe6869248bf","Type":"ContainerStarted","Data":"046c6a6c4556a968460f268de38ed68785f26217e3f5b282b54083f622bd89f1"} Dec 01 21:53:11 crc kubenswrapper[4962]: I1201 21:53:11.584025 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-9stzc" podStartSLOduration=16.109540611 podStartE2EDuration="32.584000739s" podCreationTimestamp="2025-12-01 21:52:39 +0000 UTC" firstStartedPulling="2025-12-01 21:52:53.143363902 +0000 UTC m=+1157.244803097" lastFinishedPulling="2025-12-01 21:53:09.61782402 +0000 UTC m=+1173.719263225" observedRunningTime="2025-12-01 21:53:11.571499003 +0000 UTC m=+1175.672938198" watchObservedRunningTime="2025-12-01 21:53:11.584000739 +0000 UTC m=+1175.685439934" Dec 01 21:53:12 crc kubenswrapper[4962]: I1201 21:53:12.230404 4962 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" path="/var/lib/kubelet/pods/214eff82-ef7c-49b0-a9a3-5246584e9b66/volumes" Dec 01 21:53:13 crc kubenswrapper[4962]: I1201 21:53:13.510288 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerStarted","Data":"97294dc7e4dd8cc70e7c81893b1ac1dbf62e5ad1c96188ee0339b22ce385b53b"} Dec 01 21:53:13 crc kubenswrapper[4962]: E1201 21:53:13.753884 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="fe00e319-7859-4bac-9316-156263865d80" Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.524100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe00e319-7859-4bac-9316-156263865d80","Type":"ContainerStarted","Data":"24b076726553af879d93a847fc2ef3f4d5d01dab5936fa6cd555717c35d97b44"} Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.527494 4962 generic.go:334] "Generic (PLEG): container finished" podID="aebd10ab-b3dd-4bc7-8ea0-f5883d794715" containerID="b816197ad0c41e001d921ad988531031707a87e15443c6459e4f7331898f5fbd" exitCode=0 Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.527710 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"aebd10ab-b3dd-4bc7-8ea0-f5883d794715","Type":"ContainerDied","Data":"b816197ad0c41e001d921ad988531031707a87e15443c6459e4f7331898f5fbd"} Dec 01 21:53:14 crc kubenswrapper[4962]: E1201 21:53:14.527784 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="fe00e319-7859-4bac-9316-156263865d80" Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.533151 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdpb9" event={"ID":"88fa575e-baee-41dd-8c7e-72baff22783e","Type":"ContainerStarted","Data":"3b3eccf2886d28ebb319d4608cf1149cff2744eca6dbef364c6c4e5e05d00896"} Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.533220 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdpb9" event={"ID":"88fa575e-baee-41dd-8c7e-72baff22783e","Type":"ContainerStarted","Data":"049c765324a448f5f06239d9aa7ed0e6b5c510b25741a283caac900fb160081e"} Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.533251 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.533274 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.539374 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f893a462-9c1f-4b76-84fc-ba5e84364399","Type":"ContainerStarted","Data":"1fd1f274bdd3a8a3650092a91b4bdbdd574fe4808601a8fbc99a17c45d014915"} Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.547865 4962 generic.go:334] "Generic (PLEG): container finished" podID="c09bcbbf-f96b-4f90-8f2d-9d635454a05e" 
containerID="a7396560ecdd00d6c4979b49374390be3a370d23de7c509617eca962007d5c5b" exitCode=0 Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.549206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c09bcbbf-f96b-4f90-8f2d-9d635454a05e","Type":"ContainerDied","Data":"a7396560ecdd00d6c4979b49374390be3a370d23de7c509617eca962007d5c5b"} Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.647956 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cdpb9" podStartSLOduration=17.140624656 podStartE2EDuration="32.647918805s" podCreationTimestamp="2025-12-01 21:52:42 +0000 UTC" firstStartedPulling="2025-12-01 21:52:54.198695713 +0000 UTC m=+1158.300134908" lastFinishedPulling="2025-12-01 21:53:09.705989852 +0000 UTC m=+1173.807429057" observedRunningTime="2025-12-01 21:53:14.641693427 +0000 UTC m=+1178.743132632" watchObservedRunningTime="2025-12-01 21:53:14.647918805 +0000 UTC m=+1178.749358020" Dec 01 21:53:14 crc kubenswrapper[4962]: I1201 21:53:14.706143 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.350918646 podStartE2EDuration="32.706124493s" podCreationTimestamp="2025-12-01 21:52:42 +0000 UTC" firstStartedPulling="2025-12-01 21:52:54.077364921 +0000 UTC m=+1158.178804116" lastFinishedPulling="2025-12-01 21:53:13.432570728 +0000 UTC m=+1177.534009963" observedRunningTime="2025-12-01 21:53:14.699254507 +0000 UTC m=+1178.800693712" watchObservedRunningTime="2025-12-01 21:53:14.706124493 +0000 UTC m=+1178.807563678" Dec 01 21:53:15 crc kubenswrapper[4962]: I1201 21:53:15.563686 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"aebd10ab-b3dd-4bc7-8ea0-f5883d794715","Type":"ContainerStarted","Data":"38cfc471a18bcbe5720daa2b08f0c9ec21ddda74021e722643e008b828acf63a"} Dec 01 21:53:15 crc kubenswrapper[4962]: I1201 21:53:15.566506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c09bcbbf-f96b-4f90-8f2d-9d635454a05e","Type":"ContainerStarted","Data":"f99e605afe8438a82c1a8135014d99d64a30cc3f963a16bff984bb000b5e3bfc"} Dec 01 21:53:15 crc kubenswrapper[4962]: E1201 21:53:15.568901 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="fe00e319-7859-4bac-9316-156263865d80" Dec 01 21:53:15 crc kubenswrapper[4962]: I1201 21:53:15.613073 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.650072505 podStartE2EDuration="39.613038393s" podCreationTimestamp="2025-12-01 21:52:36 +0000 UTC" firstStartedPulling="2025-12-01 21:52:52.742762226 +0000 UTC m=+1156.844201431" lastFinishedPulling="2025-12-01 21:53:09.705728114 +0000 UTC m=+1173.807167319" observedRunningTime="2025-12-01 21:53:15.607280769 +0000 UTC m=+1179.708720004" watchObservedRunningTime="2025-12-01 21:53:15.613038393 +0000 UTC m=+1179.714477648" Dec 01 21:53:15 crc kubenswrapper[4962]: I1201 21:53:15.665281 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.290894844 podStartE2EDuration="41.665254621s" podCreationTimestamp="2025-12-01 21:52:34 +0000 
UTC" firstStartedPulling="2025-12-01 21:52:53.393131387 +0000 UTC m=+1157.494570582" lastFinishedPulling="2025-12-01 21:53:09.767491164 +0000 UTC m=+1173.868930359" observedRunningTime="2025-12-01 21:53:15.659875597 +0000 UTC m=+1179.761314802" watchObservedRunningTime="2025-12-01 21:53:15.665254621 +0000 UTC m=+1179.766693856" Dec 01 21:53:16 crc kubenswrapper[4962]: I1201 21:53:16.118997 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 01 21:53:16 crc kubenswrapper[4962]: I1201 21:53:16.120069 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.321184 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.372723 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.559467 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.560132 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.587854 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.615616 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.637084 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.971136 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2c9dk"] Dec 01 21:53:17 crc kubenswrapper[4962]: E1201 21:53:17.971791 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerName="init" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.971805 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerName="init" Dec 01 21:53:17 crc kubenswrapper[4962]: E1201 21:53:17.971825 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerName="dnsmasq-dns" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.971831 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerName="dnsmasq-dns" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.972021 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="214eff82-ef7c-49b0-a9a3-5246584e9b66" containerName="dnsmasq-dns" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.973088 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.976792 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 21:53:17 crc kubenswrapper[4962]: I1201 21:53:17.980702 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2c9dk"] Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.080910 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vqjlm"] Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.082186 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.083876 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.095390 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vqjlm"] Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.141399 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.141481 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-config\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.141574 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.141632 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxnq2\" (UniqueName: \"kubernetes.io/projected/292e65d7-a332-450a-83b9-802e53eb5382-kube-api-access-wxnq2\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.242965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxnq2\" (UniqueName: \"kubernetes.io/projected/292e65d7-a332-450a-83b9-802e53eb5382-kube-api-access-wxnq2\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243038 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f47734e-2a33-432b-8030-c82a75ec77c3-combined-ca-bundle\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: 
I1201 21:53:18.243138 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243214 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-config\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1f47734e-2a33-432b-8030-c82a75ec77c3-ovn-rundir\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243289 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f47734e-2a33-432b-8030-c82a75ec77c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243360 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt9l6\" (UniqueName: \"kubernetes.io/projected/1f47734e-2a33-432b-8030-c82a75ec77c3-kube-api-access-vt9l6\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243413 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f47734e-2a33-432b-8030-c82a75ec77c3-config\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243448 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.243487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1f47734e-2a33-432b-8030-c82a75ec77c3-ovs-rundir\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.244123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 
21:53:18.244286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-config\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.244335 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.266083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxnq2\" (UniqueName: \"kubernetes.io/projected/292e65d7-a332-450a-83b9-802e53eb5382-kube-api-access-wxnq2\") pod \"dnsmasq-dns-5bf47b49b7-2c9dk\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.293362 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.346528 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1f47734e-2a33-432b-8030-c82a75ec77c3-ovs-rundir\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.346686 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f47734e-2a33-432b-8030-c82a75ec77c3-combined-ca-bundle\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.347289 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1f47734e-2a33-432b-8030-c82a75ec77c3-ovn-rundir\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.347381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f47734e-2a33-432b-8030-c82a75ec77c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.347412 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt9l6\" (UniqueName: \"kubernetes.io/projected/1f47734e-2a33-432b-8030-c82a75ec77c3-kube-api-access-vt9l6\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.347539 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f47734e-2a33-432b-8030-c82a75ec77c3-config\") pod \"ovn-controller-metrics-vqjlm\" (UID: 
\"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.348314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1f47734e-2a33-432b-8030-c82a75ec77c3-ovs-rundir\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.348539 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f47734e-2a33-432b-8030-c82a75ec77c3-config\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.348640 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1f47734e-2a33-432b-8030-c82a75ec77c3-ovn-rundir\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.352611 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f47734e-2a33-432b-8030-c82a75ec77c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.353315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f47734e-2a33-432b-8030-c82a75ec77c3-combined-ca-bundle\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.368551 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2c9dk"] Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.371264 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt9l6\" (UniqueName: \"kubernetes.io/projected/1f47734e-2a33-432b-8030-c82a75ec77c3-kube-api-access-vt9l6\") pod \"ovn-controller-metrics-vqjlm\" (UID: \"1f47734e-2a33-432b-8030-c82a75ec77c3\") " pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.397500 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vqjlm" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.456044 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-7hk2m"] Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.459928 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.464781 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.472638 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7hk2m"] Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.553397 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.553702 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-config\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.554743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srdjk\" (UniqueName: \"kubernetes.io/projected/eab93f47-8c3c-470b-9427-3b48dc613572-kube-api-access-srdjk\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.554846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-dns-svc\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.554863 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.606531 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerID="97294dc7e4dd8cc70e7c81893b1ac1dbf62e5ad1c96188ee0339b22ce385b53b" exitCode=0 Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.606604 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerDied","Data":"97294dc7e4dd8cc70e7c81893b1ac1dbf62e5ad1c96188ee0339b22ce385b53b"} Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.656589 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.656664 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-config\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.656707 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srdjk\" (UniqueName: \"kubernetes.io/projected/eab93f47-8c3c-470b-9427-3b48dc613572-kube-api-access-srdjk\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.656770 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-dns-svc\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.656790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.657703 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-config\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.657918 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-dns-svc\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.658247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.658367 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.675268 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srdjk\" (UniqueName: \"kubernetes.io/projected/eab93f47-8c3c-470b-9427-3b48dc613572-kube-api-access-srdjk\") pod \"dnsmasq-dns-8554648995-7hk2m\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.844342 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:18 crc kubenswrapper[4962]: I1201 21:53:18.874044 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2c9dk"] Dec 01 21:53:18 crc kubenswrapper[4962]: W1201 21:53:18.879383 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292e65d7_a332_450a_83b9_802e53eb5382.slice/crio-cf261afb5b558228e263ff5648af70a35bd1b190c27f6cdf50b01672014c2553 WatchSource:0}: Error finding container cf261afb5b558228e263ff5648af70a35bd1b190c27f6cdf50b01672014c2553: Status 404 returned error can't find the container with id cf261afb5b558228e263ff5648af70a35bd1b190c27f6cdf50b01672014c2553 Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.032352 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vqjlm"] Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.465777 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7hk2m"] Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.472679 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7hk2m"] Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.551518 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nn26f"] Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.553206 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.562947 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nn26f"] Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.624835 4962 generic.go:334] "Generic (PLEG): container finished" podID="292e65d7-a332-450a-83b9-802e53eb5382" containerID="8fd35d4f15dcff3daec9e3f9719a691263709aa97fe58b767294c3a2a4433bb5" exitCode=0 Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.625311 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" event={"ID":"292e65d7-a332-450a-83b9-802e53eb5382","Type":"ContainerDied","Data":"8fd35d4f15dcff3daec9e3f9719a691263709aa97fe58b767294c3a2a4433bb5"} Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.625394 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" event={"ID":"292e65d7-a332-450a-83b9-802e53eb5382","Type":"ContainerStarted","Data":"cf261afb5b558228e263ff5648af70a35bd1b190c27f6cdf50b01672014c2553"} Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.639595 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vqjlm" event={"ID":"1f47734e-2a33-432b-8030-c82a75ec77c3","Type":"ContainerStarted","Data":"7d296636c28b2a99d828db9d288084444f960eaf0fcd45b190add962a749079c"} Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.639657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vqjlm" event={"ID":"1f47734e-2a33-432b-8030-c82a75ec77c3","Type":"ContainerStarted","Data":"df94535a4e2eca6d500171bd2d96e4ed1b1bcfd9d16318e259956fb9595b7a79"} Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.661667 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7hk2m" 
event={"ID":"eab93f47-8c3c-470b-9427-3b48dc613572","Type":"ContainerStarted","Data":"97e891f5993ac1767932c12f3d00e928bfb7723d178a8ee1c8203a32763a670e"} Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.679526 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vqjlm" podStartSLOduration=1.679508433 podStartE2EDuration="1.679508433s" podCreationTimestamp="2025-12-01 21:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:53:19.675296033 +0000 UTC m=+1183.776735228" watchObservedRunningTime="2025-12-01 21:53:19.679508433 +0000 UTC m=+1183.780947628" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.690259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.690376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bslkm\" (UniqueName: \"kubernetes.io/projected/bfb8d480-e429-4cd8-b9c8-1361a41deb16-kube-api-access-bslkm\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.690401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.690460 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.690510 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-config\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.792827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bslkm\" (UniqueName: \"kubernetes.io/projected/bfb8d480-e429-4cd8-b9c8-1361a41deb16-kube-api-access-bslkm\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.793080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: 
I1201 21:53:19.793143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.793186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-config\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.793322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.794343 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.794354 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-config\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.794374 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.795142 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.822359 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bslkm\" (UniqueName: \"kubernetes.io/projected/bfb8d480-e429-4cd8-b9c8-1361a41deb16-kube-api-access-bslkm\") pod \"dnsmasq-dns-b8fbc5445-nn26f\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.873831 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.929712 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.952473 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:19 crc kubenswrapper[4962]: I1201 21:53:19.956608 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.100691 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-ovsdbserver-nb\") pod \"292e65d7-a332-450a-83b9-802e53eb5382\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.101163 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-dns-svc\") pod \"292e65d7-a332-450a-83b9-802e53eb5382\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.101241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxnq2\" (UniqueName: \"kubernetes.io/projected/292e65d7-a332-450a-83b9-802e53eb5382-kube-api-access-wxnq2\") pod \"292e65d7-a332-450a-83b9-802e53eb5382\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.101257 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-config\") pod \"292e65d7-a332-450a-83b9-802e53eb5382\" (UID: \"292e65d7-a332-450a-83b9-802e53eb5382\") " Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.117151 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292e65d7-a332-450a-83b9-802e53eb5382-kube-api-access-wxnq2" (OuterVolumeSpecName: "kube-api-access-wxnq2") pod "292e65d7-a332-450a-83b9-802e53eb5382" (UID: "292e65d7-a332-450a-83b9-802e53eb5382"). InnerVolumeSpecName "kube-api-access-wxnq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.130019 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "292e65d7-a332-450a-83b9-802e53eb5382" (UID: "292e65d7-a332-450a-83b9-802e53eb5382"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.139462 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-config" (OuterVolumeSpecName: "config") pod "292e65d7-a332-450a-83b9-802e53eb5382" (UID: "292e65d7-a332-450a-83b9-802e53eb5382"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.159273 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "292e65d7-a332-450a-83b9-802e53eb5382" (UID: "292e65d7-a332-450a-83b9-802e53eb5382"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.203331 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxnq2\" (UniqueName: \"kubernetes.io/projected/292e65d7-a332-450a-83b9-802e53eb5382-kube-api-access-wxnq2\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.203363 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.203374 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.203382 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/292e65d7-a332-450a-83b9-802e53eb5382-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.435493 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nn26f"] Dec 01 21:53:20 crc kubenswrapper[4962]: W1201 21:53:20.437222 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfb8d480_e429_4cd8_b9c8_1361a41deb16.slice/crio-2915dc50064e2a305c92086a7b1e04f0220b98ccd3636a8bc15a77f713ed415a WatchSource:0}: Error finding container 2915dc50064e2a305c92086a7b1e04f0220b98ccd3636a8bc15a77f713ed415a: Status 404 returned error can't find the container with id 2915dc50064e2a305c92086a7b1e04f0220b98ccd3636a8bc15a77f713ed415a Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.649393 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 01 21:53:20 crc kubenswrapper[4962]: E1201 21:53:20.650946 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292e65d7-a332-450a-83b9-802e53eb5382" containerName="init" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.651010 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="292e65d7-a332-450a-83b9-802e53eb5382" containerName="init" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.651290 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="292e65d7-a332-450a-83b9-802e53eb5382" containerName="init" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.695361 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.697287 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-zl7zs" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.697297 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.697417 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.698115 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.703840 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.713558 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.713626 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-2c9dk" event={"ID":"292e65d7-a332-450a-83b9-802e53eb5382","Type":"ContainerDied","Data":"cf261afb5b558228e263ff5648af70a35bd1b190c27f6cdf50b01672014c2553"} Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.714829 4962 scope.go:117] "RemoveContainer" containerID="8fd35d4f15dcff3daec9e3f9719a691263709aa97fe58b767294c3a2a4433bb5" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.724978 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" event={"ID":"bfb8d480-e429-4cd8-b9c8-1361a41deb16","Type":"ContainerStarted","Data":"2915dc50064e2a305c92086a7b1e04f0220b98ccd3636a8bc15a77f713ed415a"} Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.730962 4962 generic.go:334] "Generic (PLEG): container finished" podID="eab93f47-8c3c-470b-9427-3b48dc613572" containerID="13e3bc633aadab198f2554baf71af5d2ba546040b744cd23b795ad1a391c00cc" exitCode=0 Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.731270 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7hk2m" event={"ID":"eab93f47-8c3c-470b-9427-3b48dc613572","Type":"ContainerDied","Data":"13e3bc633aadab198f2554baf71af5d2ba546040b744cd23b795ad1a391c00cc"} Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.780701 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2c9dk"] Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.798216 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2c9dk"] Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.815507 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-lock\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.815576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w7kl\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-kube-api-access-8w7kl\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc 
kubenswrapper[4962]: I1201 21:53:20.815719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.815750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.815809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-cache\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.920034 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w7kl\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-kube-api-access-8w7kl\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.920157 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.920189 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.920240 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-cache\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.920301 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-lock\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: E1201 21:53:20.920424 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 21:53:20 crc kubenswrapper[4962]: E1201 21:53:20.920451 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 21:53:20 crc kubenswrapper[4962]: E1201 21:53:20.920509 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift podName:f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3 nodeName:}" failed. 
No retries permitted until 2025-12-01 21:53:21.42048961 +0000 UTC m=+1185.521928805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift") pod "swift-storage-0" (UID: "f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3") : configmap "swift-ring-files" not found Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.920652 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.928216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-cache\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.928495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-lock\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.961597 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:20 crc kubenswrapper[4962]: I1201 21:53:20.966095 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w7kl\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-kube-api-access-8w7kl\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.052664 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.131189 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-dns-svc\") pod \"eab93f47-8c3c-470b-9427-3b48dc613572\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.131230 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-sb\") pod \"eab93f47-8c3c-470b-9427-3b48dc613572\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.131309 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srdjk\" (UniqueName: \"kubernetes.io/projected/eab93f47-8c3c-470b-9427-3b48dc613572-kube-api-access-srdjk\") pod \"eab93f47-8c3c-470b-9427-3b48dc613572\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.131432 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-config\") pod \"eab93f47-8c3c-470b-9427-3b48dc613572\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.131556 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-nb\") pod \"eab93f47-8c3c-470b-9427-3b48dc613572\" (UID: \"eab93f47-8c3c-470b-9427-3b48dc613572\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.136697 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab93f47-8c3c-470b-9427-3b48dc613572-kube-api-access-srdjk" (OuterVolumeSpecName: "kube-api-access-srdjk") pod "eab93f47-8c3c-470b-9427-3b48dc613572" (UID: "eab93f47-8c3c-470b-9427-3b48dc613572"). InnerVolumeSpecName "kube-api-access-srdjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.157656 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-8fpdb"] Dec 01 21:53:21 crc kubenswrapper[4962]: E1201 21:53:21.158150 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab93f47-8c3c-470b-9427-3b48dc613572" containerName="init" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.158166 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab93f47-8c3c-470b-9427-3b48dc613572" containerName="init" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.158361 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab93f47-8c3c-470b-9427-3b48dc613572" containerName="init" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.161403 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.165237 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.170394 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.170795 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.180156 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-8fpdb"] Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.188652 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8fpdb"] Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.204062 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-config" (OuterVolumeSpecName: "config") pod "eab93f47-8c3c-470b-9427-3b48dc613572" (UID: "eab93f47-8c3c-470b-9427-3b48dc613572"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: E1201 21:53:21.204535 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-kvcrd ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-8fpdb" podUID="9ebee34e-0222-4cee-a9ee-f08d1d5fc670" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.212523 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eab93f47-8c3c-470b-9427-3b48dc613572" (UID: "eab93f47-8c3c-470b-9427-3b48dc613572"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.220438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eab93f47-8c3c-470b-9427-3b48dc613572" (UID: "eab93f47-8c3c-470b-9427-3b48dc613572"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.229318 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eab93f47-8c3c-470b-9427-3b48dc613572" (UID: "eab93f47-8c3c-470b-9427-3b48dc613572"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.235720 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-scripts\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.235808 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-etc-swift\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.237177 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5db2p"] Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.240360 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.241413 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-ring-data-devices\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.241509 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvcrd\" (UniqueName: \"kubernetes.io/projected/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-kube-api-access-kvcrd\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.241585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-combined-ca-bundle\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.241640 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-swiftconf\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.241690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-dispersionconf\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.242019 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.242037 4962 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.242048 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.242056 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eab93f47-8c3c-470b-9427-3b48dc613572-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.242065 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srdjk\" (UniqueName: \"kubernetes.io/projected/eab93f47-8c3c-470b-9427-3b48dc613572-kube-api-access-srdjk\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.253325 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5db2p"] Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.343957 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-scripts\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-combined-ca-bundle\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-etc-swift\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344146 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-ring-data-devices\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344174 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-etc-swift\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvcrd\" (UniqueName: \"kubernetes.io/projected/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-kube-api-access-kvcrd\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 
21:53:21.344225 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-combined-ca-bundle\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-swiftconf\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344272 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-dispersionconf\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-dispersionconf\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-scripts\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczsp\" (UniqueName: \"kubernetes.io/projected/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-kube-api-access-bczsp\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344383 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-ring-data-devices\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.344401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-swiftconf\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.345296 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-scripts\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.345439 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-etc-swift\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.345562 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-ring-data-devices\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.348530 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-swiftconf\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.348999 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-dispersionconf\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.352432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-combined-ca-bundle\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.365728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvcrd\" (UniqueName: \"kubernetes.io/projected/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-kube-api-access-kvcrd\") pod \"swift-ring-rebalance-8fpdb\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.446391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-etc-swift\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.446666 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-dispersionconf\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.446793 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-etc-swift\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.446816 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.447025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-scripts\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.447139 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bczsp\" (UniqueName: \"kubernetes.io/projected/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-kube-api-access-bczsp\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.447232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-ring-data-devices\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: E1201 21:53:21.447048 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 21:53:21 crc kubenswrapper[4962]: E1201 21:53:21.447362 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.447320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-swiftconf\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: E1201 21:53:21.447453 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift podName:f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3 nodeName:}" failed. No retries permitted until 2025-12-01 21:53:22.447429324 +0000 UTC m=+1186.548868519 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift") pod "swift-storage-0" (UID: "f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3") : configmap "swift-ring-files" not found Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.447689 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-combined-ca-bundle\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.447786 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-ring-data-devices\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.447711 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-scripts\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.452567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-dispersionconf\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.453592 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-swiftconf\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.457044 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-combined-ca-bundle\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.464921 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczsp\" (UniqueName: \"kubernetes.io/projected/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-kube-api-access-bczsp\") pod \"swift-ring-rebalance-5db2p\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.566660 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.746103 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerID="1594447eaf1039a5d8eefd1cacc041bfc9767ecf6be3bd82b31dd5eca34ade07" exitCode=0 Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.746241 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" event={"ID":"bfb8d480-e429-4cd8-b9c8-1361a41deb16","Type":"ContainerDied","Data":"1594447eaf1039a5d8eefd1cacc041bfc9767ecf6be3bd82b31dd5eca34ade07"} Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.748322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7hk2m" event={"ID":"eab93f47-8c3c-470b-9427-3b48dc613572","Type":"ContainerDied","Data":"97e891f5993ac1767932c12f3d00e928bfb7723d178a8ee1c8203a32763a670e"} Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.748358 4962 scope.go:117] "RemoveContainer" containerID="13e3bc633aadab198f2554baf71af5d2ba546040b744cd23b795ad1a391c00cc" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.748430 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7hk2m" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.752796 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.774096 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.846140 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7hk2m"] Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.852839 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvcrd\" (UniqueName: \"kubernetes.io/projected/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-kube-api-access-kvcrd\") pod \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.852884 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-dispersionconf\") pod \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.852924 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-scripts\") pod \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.853030 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-swiftconf\") pod \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.853105 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-etc-swift\") pod \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\" (UID: 
\"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.853150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-combined-ca-bundle\") pod \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.853289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-ring-data-devices\") pod \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\" (UID: \"9ebee34e-0222-4cee-a9ee-f08d1d5fc670\") " Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.854496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9ebee34e-0222-4cee-a9ee-f08d1d5fc670" (UID: "9ebee34e-0222-4cee-a9ee-f08d1d5fc670"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.854883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-scripts" (OuterVolumeSpecName: "scripts") pod "9ebee34e-0222-4cee-a9ee-f08d1d5fc670" (UID: "9ebee34e-0222-4cee-a9ee-f08d1d5fc670"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.855248 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9ebee34e-0222-4cee-a9ee-f08d1d5fc670" (UID: "9ebee34e-0222-4cee-a9ee-f08d1d5fc670"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.858038 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ebee34e-0222-4cee-a9ee-f08d1d5fc670" (UID: "9ebee34e-0222-4cee-a9ee-f08d1d5fc670"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.860304 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-kube-api-access-kvcrd" (OuterVolumeSpecName: "kube-api-access-kvcrd") pod "9ebee34e-0222-4cee-a9ee-f08d1d5fc670" (UID: "9ebee34e-0222-4cee-a9ee-f08d1d5fc670"). InnerVolumeSpecName "kube-api-access-kvcrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.861404 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9ebee34e-0222-4cee-a9ee-f08d1d5fc670" (UID: "9ebee34e-0222-4cee-a9ee-f08d1d5fc670"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.862223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9ebee34e-0222-4cee-a9ee-f08d1d5fc670" (UID: "9ebee34e-0222-4cee-a9ee-f08d1d5fc670"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.885100 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7hk2m"] Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.959511 4962 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.959554 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvcrd\" (UniqueName: \"kubernetes.io/projected/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-kube-api-access-kvcrd\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.959571 4962 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.959594 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.959606 4962 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.959616 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: I1201 21:53:21.959632 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ebee34e-0222-4cee-a9ee-f08d1d5fc670-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:21 crc kubenswrapper[4962]: E1201 21:53:21.993258 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeab93f47_8c3c_470b_9427_3b48dc613572.slice/crio-97e891f5993ac1767932c12f3d00e928bfb7723d178a8ee1c8203a32763a670e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeab93f47_8c3c_470b_9427_3b48dc613572.slice\": RecentStats: unable to find data in memory cache]" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.050665 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5db2p"] Dec 01 21:53:22 crc kubenswrapper[4962]: W1201 21:53:22.053538 4962 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34cbe04f_2bf2_4b5e_bf91_00787b7e4fee.slice/crio-baee82bd951435ebc75516dd6df589905e9751df53568c4bdde12718379643f4 WatchSource:0}: Error finding container baee82bd951435ebc75516dd6df589905e9751df53568c4bdde12718379643f4: Status 404 returned error can't find the container with id baee82bd951435ebc75516dd6df589905e9751df53568c4bdde12718379643f4 Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.229952 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292e65d7-a332-450a-83b9-802e53eb5382" path="/var/lib/kubelet/pods/292e65d7-a332-450a-83b9-802e53eb5382/volumes" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.230535 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab93f47-8c3c-470b-9427-3b48dc613572" path="/var/lib/kubelet/pods/eab93f47-8c3c-470b-9427-3b48dc613572/volumes" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.273408 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.350440 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.467602 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:22 crc kubenswrapper[4962]: E1201 21:53:22.467986 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 21:53:22 crc kubenswrapper[4962]: E1201 21:53:22.468030 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 21:53:22 crc kubenswrapper[4962]: E1201 21:53:22.468111 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift podName:f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3 nodeName:}" failed. No retries permitted until 2025-12-01 21:53:24.468082584 +0000 UTC m=+1188.569521799 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift") pod "swift-storage-0" (UID: "f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3") : configmap "swift-ring-files" not found Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.818146 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" event={"ID":"bfb8d480-e429-4cd8-b9c8-1361a41deb16","Type":"ContainerStarted","Data":"749385e365b4e87b4a31f8cc1590103d02a4cbb7c023be6575567602b907c9d7"} Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.818284 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.826777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5db2p" event={"ID":"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee","Type":"ContainerStarted","Data":"baee82bd951435ebc75516dd6df589905e9751df53568c4bdde12718379643f4"} Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.834250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph" event={"ID":"f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d","Type":"ContainerStarted","Data":"9d64b154b9caf1adf256d2f416063c67f520c06c50805d60c0e2915196b9b130"} Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.834317 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8fpdb" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.842268 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" podStartSLOduration=3.842252455 podStartE2EDuration="3.842252455s" podCreationTimestamp="2025-12-01 21:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:53:22.833124775 +0000 UTC m=+1186.934563980" watchObservedRunningTime="2025-12-01 21:53:22.842252455 +0000 UTC m=+1186.943691660" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.872346 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xd7ph" podStartSLOduration=11.98284529 podStartE2EDuration="40.872325252s" podCreationTimestamp="2025-12-01 21:52:42 +0000 UTC" firstStartedPulling="2025-12-01 21:52:53.154229281 +0000 UTC m=+1157.255668476" lastFinishedPulling="2025-12-01 21:53:22.043709243 +0000 UTC m=+1186.145148438" observedRunningTime="2025-12-01 21:53:22.867297958 +0000 UTC m=+1186.968737153" watchObservedRunningTime="2025-12-01 21:53:22.872325252 +0000 UTC m=+1186.973764457" Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.918500 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8fpdb"] Dec 01 21:53:22 crc kubenswrapper[4962]: I1201 21:53:22.930691 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-8fpdb"] Dec 01 21:53:24 crc kubenswrapper[4962]: I1201 21:53:24.232717 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ebee34e-0222-4cee-a9ee-f08d1d5fc670" path="/var/lib/kubelet/pods/9ebee34e-0222-4cee-a9ee-f08d1d5fc670/volumes" Dec 01 21:53:24 crc kubenswrapper[4962]: I1201 21:53:24.509645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") 
pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:24 crc kubenswrapper[4962]: E1201 21:53:24.509887 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 21:53:24 crc kubenswrapper[4962]: E1201 21:53:24.509962 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 21:53:24 crc kubenswrapper[4962]: E1201 21:53:24.510078 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift podName:f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3 nodeName:}" failed. No retries permitted until 2025-12-01 21:53:28.510025472 +0000 UTC m=+1192.611464707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift") pod "swift-storage-0" (UID: "f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3") : configmap "swift-ring-files" not found Dec 01 21:53:25 crc kubenswrapper[4962]: I1201 21:53:25.872320 4962 generic.go:334] "Generic (PLEG): container finished" podID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerID="8adf4ba0aa720627144c4b6055ae0379a2e8bc72a0049b5aea7634192f4d4038" exitCode=0 Dec 01 21:53:25 crc kubenswrapper[4962]: I1201 21:53:25.872387 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9ef8bb6-0fc4-411e-82a1-85d95ced5818","Type":"ContainerDied","Data":"8adf4ba0aa720627144c4b6055ae0379a2e8bc72a0049b5aea7634192f4d4038"} Dec 01 21:53:25 crc kubenswrapper[4962]: I1201 21:53:25.874647 4962 generic.go:334] "Generic (PLEG): container finished" podID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerID="2e612e8c7d52bd7bb195592643b0167d6f4ce348b0ef115b6d213703e68c13cb" exitCode=0 Dec 01 21:53:25 crc kubenswrapper[4962]: I1201 21:53:25.874681 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e9a059a-712b-4ff4-b50e-7d94a96a9db5","Type":"ContainerDied","Data":"2e612e8c7d52bd7bb195592643b0167d6f4ce348b0ef115b6d213703e68c13cb"} Dec 01 21:53:26 crc kubenswrapper[4962]: I1201 21:53:26.473388 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-578c49649f-mltwz" podUID="75defef6-e656-452e-a623-8e1dd47c8078" containerName="console" containerID="cri-o://719c04e831220c088ca2d9bac2c0648a40fbdf7225b0a92e1eb15edcfc075d99" gracePeriod=15 Dec 01 21:53:26 crc kubenswrapper[4962]: I1201 21:53:26.886436 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578c49649f-mltwz_75defef6-e656-452e-a623-8e1dd47c8078/console/0.log" Dec 01 21:53:26 crc kubenswrapper[4962]: I1201 21:53:26.887557 4962 generic.go:334] "Generic (PLEG): container finished" podID="75defef6-e656-452e-a623-8e1dd47c8078" containerID="719c04e831220c088ca2d9bac2c0648a40fbdf7225b0a92e1eb15edcfc075d99" exitCode=2 Dec 01 21:53:26 crc kubenswrapper[4962]: I1201 21:53:26.887652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578c49649f-mltwz" event={"ID":"75defef6-e656-452e-a623-8e1dd47c8078","Type":"ContainerDied","Data":"719c04e831220c088ca2d9bac2c0648a40fbdf7225b0a92e1eb15edcfc075d99"} Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.308492 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bk224"] Dec 
01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.310363 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.319000 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3d8a-account-create-update-8j7jd"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.320673 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.322029 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.325522 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bk224"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.348433 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d8a-account-create-update-8j7jd"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.376293 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e37bf0-3279-477d-bb05-cb6744af0908-operator-scripts\") pod \"keystone-3d8a-account-create-update-8j7jd\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.376330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc37b492-d684-42d9-a258-391677fbb9d7-operator-scripts\") pod \"keystone-db-create-bk224\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.376508 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d9bf\" (UniqueName: \"kubernetes.io/projected/90e37bf0-3279-477d-bb05-cb6744af0908-kube-api-access-6d9bf\") pod \"keystone-3d8a-account-create-update-8j7jd\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.376548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r77bp\" (UniqueName: \"kubernetes.io/projected/fc37b492-d684-42d9-a258-391677fbb9d7-kube-api-access-r77bp\") pod \"keystone-db-create-bk224\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.478607 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc37b492-d684-42d9-a258-391677fbb9d7-operator-scripts\") pod \"keystone-db-create-bk224\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.480712 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d9bf\" (UniqueName: \"kubernetes.io/projected/90e37bf0-3279-477d-bb05-cb6744af0908-kube-api-access-6d9bf\") pod \"keystone-3d8a-account-create-update-8j7jd\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " 
pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.480804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r77bp\" (UniqueName: \"kubernetes.io/projected/fc37b492-d684-42d9-a258-391677fbb9d7-kube-api-access-r77bp\") pod \"keystone-db-create-bk224\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.480863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e37bf0-3279-477d-bb05-cb6744af0908-operator-scripts\") pod \"keystone-3d8a-account-create-update-8j7jd\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.481858 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e37bf0-3279-477d-bb05-cb6744af0908-operator-scripts\") pod \"keystone-3d8a-account-create-update-8j7jd\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.482266 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc37b492-d684-42d9-a258-391677fbb9d7-operator-scripts\") pod \"keystone-db-create-bk224\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.505209 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8b9sd"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.506386 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.511492 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r77bp\" (UniqueName: \"kubernetes.io/projected/fc37b492-d684-42d9-a258-391677fbb9d7-kube-api-access-r77bp\") pod \"keystone-db-create-bk224\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.514590 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d9bf\" (UniqueName: \"kubernetes.io/projected/90e37bf0-3279-477d-bb05-cb6744af0908-kube-api-access-6d9bf\") pod \"keystone-3d8a-account-create-update-8j7jd\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.518599 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8b9sd"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.556394 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xd7ph" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.623028 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4fa1-account-create-update-rfbmh"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.624496 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.626580 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.639213 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bk224" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.640108 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4fa1-account-create-update-rfbmh"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.650977 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.692817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-operator-scripts\") pod \"placement-4fa1-account-create-update-rfbmh\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.692910 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz6pz\" (UniqueName: \"kubernetes.io/projected/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-kube-api-access-xz6pz\") pod \"placement-db-create-8b9sd\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.692975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-operator-scripts\") pod \"placement-db-create-8b9sd\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.693030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqqs4\" (UniqueName: \"kubernetes.io/projected/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-kube-api-access-vqqs4\") pod \"placement-4fa1-account-create-update-rfbmh\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.789794 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gxffv"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.791428 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gxffv" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.796309 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-operator-scripts\") pod \"placement-4fa1-account-create-update-rfbmh\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.796600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz6pz\" (UniqueName: \"kubernetes.io/projected/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-kube-api-access-xz6pz\") pod \"placement-db-create-8b9sd\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.796752 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-operator-scripts\") pod \"placement-db-create-8b9sd\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.796922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqqs4\" (UniqueName: \"kubernetes.io/projected/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-kube-api-access-vqqs4\") pod \"placement-4fa1-account-create-update-rfbmh\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.798431 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-operator-scripts\") pod \"placement-4fa1-account-create-update-rfbmh\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.798775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-operator-scripts\") pod \"placement-db-create-8b9sd\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.803726 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gxffv"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.828520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz6pz\" (UniqueName: \"kubernetes.io/projected/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-kube-api-access-xz6pz\") pod \"placement-db-create-8b9sd\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.829071 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqqs4\" (UniqueName: \"kubernetes.io/projected/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-kube-api-access-vqqs4\") pod \"placement-4fa1-account-create-update-rfbmh\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.898967 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwc69\" (UniqueName: \"kubernetes.io/projected/75a602fc-6e93-4dff-a482-64734dd6a817-kube-api-access-jwc69\") pod \"glance-db-create-gxffv\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " pod="openstack/glance-db-create-gxffv" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.899147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a602fc-6e93-4dff-a482-64734dd6a817-operator-scripts\") pod \"glance-db-create-gxffv\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " pod="openstack/glance-db-create-gxffv" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.904201 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3f94-account-create-update-tcsxp"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.909704 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.919872 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3f94-account-create-update-tcsxp"] Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.920156 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.922468 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 21:53:27 crc kubenswrapper[4962]: I1201 21:53:27.940722 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.001429 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwc69\" (UniqueName: \"kubernetes.io/projected/75a602fc-6e93-4dff-a482-64734dd6a817-kube-api-access-jwc69\") pod \"glance-db-create-gxffv\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " pod="openstack/glance-db-create-gxffv" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.001564 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a602fc-6e93-4dff-a482-64734dd6a817-operator-scripts\") pod \"glance-db-create-gxffv\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " pod="openstack/glance-db-create-gxffv" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.002817 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a602fc-6e93-4dff-a482-64734dd6a817-operator-scripts\") pod \"glance-db-create-gxffv\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " pod="openstack/glance-db-create-gxffv" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.020055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwc69\" (UniqueName: \"kubernetes.io/projected/75a602fc-6e93-4dff-a482-64734dd6a817-kube-api-access-jwc69\") pod \"glance-db-create-gxffv\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " pod="openstack/glance-db-create-gxffv" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.103081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/40f2f133-6737-4ab7-ab43-4de519e2d4c0-operator-scripts\") pod \"glance-3f94-account-create-update-tcsxp\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.103561 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2dvt\" (UniqueName: \"kubernetes.io/projected/40f2f133-6737-4ab7-ab43-4de519e2d4c0-kube-api-access-v2dvt\") pod \"glance-3f94-account-create-update-tcsxp\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.123815 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gxffv" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.205514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2dvt\" (UniqueName: \"kubernetes.io/projected/40f2f133-6737-4ab7-ab43-4de519e2d4c0-kube-api-access-v2dvt\") pod \"glance-3f94-account-create-update-tcsxp\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.205597 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f2f133-6737-4ab7-ab43-4de519e2d4c0-operator-scripts\") pod \"glance-3f94-account-create-update-tcsxp\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.206351 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f2f133-6737-4ab7-ab43-4de519e2d4c0-operator-scripts\") pod \"glance-3f94-account-create-update-tcsxp\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.229532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2dvt\" (UniqueName: \"kubernetes.io/projected/40f2f133-6737-4ab7-ab43-4de519e2d4c0-kube-api-access-v2dvt\") pod \"glance-3f94-account-create-update-tcsxp\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.239006 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:28 crc kubenswrapper[4962]: I1201 21:53:28.513402 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:28 crc kubenswrapper[4962]: E1201 21:53:28.513581 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 21:53:28 crc kubenswrapper[4962]: E1201 21:53:28.513607 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 21:53:28 crc kubenswrapper[4962]: E1201 21:53:28.513669 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift podName:f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3 nodeName:}" failed. No retries permitted until 2025-12-01 21:53:36.513650522 +0000 UTC m=+1200.615089717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift") pod "swift-storage-0" (UID: "f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3") : configmap "swift-ring-files" not found Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.259384 4962 patch_prober.go:28] interesting pod/console-578c49649f-mltwz container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.88:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.259445 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-578c49649f-mltwz" podUID="75defef6-e656-452e-a623-8e1dd47c8078" containerName="console" probeResult="failure" output="Get \"https://10.217.0.88:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.405756 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-qd95v"] Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.407792 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.412691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-qd95v"] Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.537913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjlft\" (UniqueName: \"kubernetes.io/projected/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-kube-api-access-rjlft\") pod \"mysqld-exporter-openstack-db-create-qd95v\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.538193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-qd95v\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.595116 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-3071-account-create-update-pqf8s"] Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.596801 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.599052 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.604200 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-3071-account-create-update-pqf8s"] Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.639966 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjlft\" (UniqueName: \"kubernetes.io/projected/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-kube-api-access-rjlft\") pod \"mysqld-exporter-openstack-db-create-qd95v\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.640058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-qd95v\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.640759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-qd95v\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.657890 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjlft\" (UniqueName: \"kubernetes.io/projected/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-kube-api-access-rjlft\") pod \"mysqld-exporter-openstack-db-create-qd95v\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " 
pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.737002 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.741573 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d75vw\" (UniqueName: \"kubernetes.io/projected/287023aa-b843-4ff9-bca0-7c5fcfc67688-kube-api-access-d75vw\") pod \"mysqld-exporter-3071-account-create-update-pqf8s\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.741904 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/287023aa-b843-4ff9-bca0-7c5fcfc67688-operator-scripts\") pod \"mysqld-exporter-3071-account-create-update-pqf8s\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.843551 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d75vw\" (UniqueName: \"kubernetes.io/projected/287023aa-b843-4ff9-bca0-7c5fcfc67688-kube-api-access-d75vw\") pod \"mysqld-exporter-3071-account-create-update-pqf8s\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.843838 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/287023aa-b843-4ff9-bca0-7c5fcfc67688-operator-scripts\") pod \"mysqld-exporter-3071-account-create-update-pqf8s\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.844643 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/287023aa-b843-4ff9-bca0-7c5fcfc67688-operator-scripts\") pod \"mysqld-exporter-3071-account-create-update-pqf8s\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.865042 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578c49649f-mltwz_75defef6-e656-452e-a623-8e1dd47c8078/console/0.log" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.865100 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.869461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d75vw\" (UniqueName: \"kubernetes.io/projected/287023aa-b843-4ff9-bca0-7c5fcfc67688-kube-api-access-d75vw\") pod \"mysqld-exporter-3071-account-create-update-pqf8s\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.932457 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578c49649f-mltwz_75defef6-e656-452e-a623-8e1dd47c8078/console/0.log" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.932533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578c49649f-mltwz" event={"ID":"75defef6-e656-452e-a623-8e1dd47c8078","Type":"ContainerDied","Data":"f6ecf38f2d33aad78a29e4a4a63f32add8c01e305bd00e74e4227c777384ac9a"} Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.932585 4962 scope.go:117] "RemoveContainer" containerID="719c04e831220c088ca2d9bac2c0648a40fbdf7225b0a92e1eb15edcfc075d99" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.932796 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578c49649f-mltwz" Dec 01 21:53:29 crc kubenswrapper[4962]: I1201 21:53:29.954225 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.027031 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.054820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-serving-cert\") pod \"75defef6-e656-452e-a623-8e1dd47c8078\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.054975 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-oauth-config\") pod \"75defef6-e656-452e-a623-8e1dd47c8078\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.056302 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-service-ca\") pod \"75defef6-e656-452e-a623-8e1dd47c8078\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.056399 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-trusted-ca-bundle\") pod \"75defef6-e656-452e-a623-8e1dd47c8078\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.056532 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqwfp\" (UniqueName: \"kubernetes.io/projected/75defef6-e656-452e-a623-8e1dd47c8078-kube-api-access-xqwfp\") pod \"75defef6-e656-452e-a623-8e1dd47c8078\" 
(UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.056588 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-console-config\") pod \"75defef6-e656-452e-a623-8e1dd47c8078\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.056704 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-oauth-serving-cert\") pod \"75defef6-e656-452e-a623-8e1dd47c8078\" (UID: \"75defef6-e656-452e-a623-8e1dd47c8078\") " Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.059361 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-service-ca" (OuterVolumeSpecName: "service-ca") pod "75defef6-e656-452e-a623-8e1dd47c8078" (UID: "75defef6-e656-452e-a623-8e1dd47c8078"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.059381 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "75defef6-e656-452e-a623-8e1dd47c8078" (UID: "75defef6-e656-452e-a623-8e1dd47c8078"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.059778 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.059804 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.060389 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "75defef6-e656-452e-a623-8e1dd47c8078" (UID: "75defef6-e656-452e-a623-8e1dd47c8078"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.060812 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-console-config" (OuterVolumeSpecName: "console-config") pod "75defef6-e656-452e-a623-8e1dd47c8078" (UID: "75defef6-e656-452e-a623-8e1dd47c8078"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.062699 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "75defef6-e656-452e-a623-8e1dd47c8078" (UID: "75defef6-e656-452e-a623-8e1dd47c8078"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.066601 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75defef6-e656-452e-a623-8e1dd47c8078-kube-api-access-xqwfp" (OuterVolumeSpecName: "kube-api-access-xqwfp") pod "75defef6-e656-452e-a623-8e1dd47c8078" (UID: "75defef6-e656-452e-a623-8e1dd47c8078"). InnerVolumeSpecName "kube-api-access-xqwfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.068493 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "75defef6-e656-452e-a623-8e1dd47c8078" (UID: "75defef6-e656-452e-a623-8e1dd47c8078"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.069504 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kv54m"] Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.069909 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" containerName="dnsmasq-dns" containerID="cri-o://ef39056f5dc29b17fbcdcec508f51b55561a93e38b7c4f611a448d91a3c73791" gracePeriod=10 Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.161506 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.161539 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75defef6-e656-452e-a623-8e1dd47c8078-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.161550 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.161560 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75defef6-e656-452e-a623-8e1dd47c8078-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.161568 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqwfp\" (UniqueName: \"kubernetes.io/projected/75defef6-e656-452e-a623-8e1dd47c8078-kube-api-access-xqwfp\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.293703 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578c49649f-mltwz"] Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.302172 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-578c49649f-mltwz"] Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.622127 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3f94-account-create-update-tcsxp"] Dec 01 21:53:30 crc kubenswrapper[4962]: W1201 21:53:30.626649 4962 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40f2f133_6737_4ab7_ab43_4de519e2d4c0.slice/crio-a2aee4db5ca85296a68119a66b5ec1ae23c226971a2f8169c7f17261139a22e3 WatchSource:0}: Error finding container a2aee4db5ca85296a68119a66b5ec1ae23c226971a2f8169c7f17261139a22e3: Status 404 returned error can't find the container with id a2aee4db5ca85296a68119a66b5ec1ae23c226971a2f8169c7f17261139a22e3 Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.947667 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9ef8bb6-0fc4-411e-82a1-85d95ced5818","Type":"ContainerStarted","Data":"461b6d48eac2ff8ad256d35fbf30e75bf1e1a0278acfdd421ba2a7d95ae106de"} Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.948234 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.952798 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5db2p" event={"ID":"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee","Type":"ContainerStarted","Data":"833b74755b103539c261d57def2dacb37e5701587d65924b6de39c3bf25557a5"} Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.954844 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3f94-account-create-update-tcsxp" event={"ID":"40f2f133-6737-4ab7-ab43-4de519e2d4c0","Type":"ContainerStarted","Data":"a2aee4db5ca85296a68119a66b5ec1ae23c226971a2f8169c7f17261139a22e3"} Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.960846 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e9a059a-712b-4ff4-b50e-7d94a96a9db5","Type":"ContainerStarted","Data":"e05d8b8b65733b514c93844819537d91355a50c1bbe84d5b1f3c2f1e6383e213"} Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.961770 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.972914 4962 generic.go:334] "Generic (PLEG): container finished" podID="33007873-cb3d-4f47-8883-a60f3f823a16" containerID="ef39056f5dc29b17fbcdcec508f51b55561a93e38b7c4f611a448d91a3c73791" exitCode=0 Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.972967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" event={"ID":"33007873-cb3d-4f47-8883-a60f3f823a16","Type":"ContainerDied","Data":"ef39056f5dc29b17fbcdcec508f51b55561a93e38b7c4f611a448d91a3c73791"} Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.972990 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" event={"ID":"33007873-cb3d-4f47-8883-a60f3f823a16","Type":"ContainerDied","Data":"e52e85602f6fe719f856f8cef376f2f3020aaf6e0f60d71855a7f6464d21fff8"} Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.973002 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e52e85602f6fe719f856f8cef376f2f3020aaf6e0f60d71855a7f6464d21fff8" Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.986339 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d8a-account-create-update-8j7jd"] Dec 01 21:53:30 crc kubenswrapper[4962]: I1201 21:53:30.996071 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8b9sd"] Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.002574 4962 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=47.136153364 podStartE2EDuration="58.002557545s" podCreationTimestamp="2025-12-01 21:52:33 +0000 UTC" firstStartedPulling="2025-12-01 21:52:40.706791196 +0000 UTC m=+1144.808230391" lastFinishedPulling="2025-12-01 21:52:51.573195377 +0000 UTC m=+1155.674634572" observedRunningTime="2025-12-01 21:53:30.980759914 +0000 UTC m=+1195.082199119" watchObservedRunningTime="2025-12-01 21:53:31.002557545 +0000 UTC m=+1195.103996740" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.025161 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=47.235891304 podStartE2EDuration="58.025107198s" podCreationTimestamp="2025-12-01 21:52:33 +0000 UTC" firstStartedPulling="2025-12-01 21:52:40.709263567 +0000 UTC m=+1144.810702762" lastFinishedPulling="2025-12-01 21:52:51.498479461 +0000 UTC m=+1155.599918656" observedRunningTime="2025-12-01 21:53:31.006684173 +0000 UTC m=+1195.108123368" watchObservedRunningTime="2025-12-01 21:53:31.025107198 +0000 UTC m=+1195.126546393" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.063397 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.199859 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bk224"] Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.214822 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-dns-svc\") pod \"33007873-cb3d-4f47-8883-a60f3f823a16\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.214884 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-config\") pod \"33007873-cb3d-4f47-8883-a60f3f823a16\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.214949 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46qf\" (UniqueName: \"kubernetes.io/projected/33007873-cb3d-4f47-8883-a60f3f823a16-kube-api-access-g46qf\") pod \"33007873-cb3d-4f47-8883-a60f3f823a16\" (UID: \"33007873-cb3d-4f47-8883-a60f3f823a16\") " Dec 01 21:53:31 crc kubenswrapper[4962]: W1201 21:53:31.219186 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75a602fc_6e93_4dff_a482_64734dd6a817.slice/crio-32c6fc9be4000ee276f9e271db11c35dd79b7c5e88eb1adfa32ce7ff1b187a78 WatchSource:0}: Error finding container 32c6fc9be4000ee276f9e271db11c35dd79b7c5e88eb1adfa32ce7ff1b187a78: Status 404 returned error can't find the container with id 32c6fc9be4000ee276f9e271db11c35dd79b7c5e88eb1adfa32ce7ff1b187a78 Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.224157 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4fa1-account-create-update-rfbmh"] Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.224819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33007873-cb3d-4f47-8883-a60f3f823a16-kube-api-access-g46qf" (OuterVolumeSpecName: "kube-api-access-g46qf") pod 
"33007873-cb3d-4f47-8883-a60f3f823a16" (UID: "33007873-cb3d-4f47-8883-a60f3f823a16"). InnerVolumeSpecName "kube-api-access-g46qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.237104 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gxffv"] Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.249749 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-qd95v"] Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.279250 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-config" (OuterVolumeSpecName: "config") pod "33007873-cb3d-4f47-8883-a60f3f823a16" (UID: "33007873-cb3d-4f47-8883-a60f3f823a16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.291843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33007873-cb3d-4f47-8883-a60f3f823a16" (UID: "33007873-cb3d-4f47-8883-a60f3f823a16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.317325 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.317349 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33007873-cb3d-4f47-8883-a60f3f823a16-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.317358 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46qf\" (UniqueName: \"kubernetes.io/projected/33007873-cb3d-4f47-8883-a60f3f823a16-kube-api-access-g46qf\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.350791 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-3071-account-create-update-pqf8s"] Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.988498 4962 generic.go:334] "Generic (PLEG): container finished" podID="057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0" containerID="4c6cb28056486117a7cacbe282e06f687ec3d7e1358ff656bc53622675e1876a" exitCode=0 Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.988615 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4fa1-account-create-update-rfbmh" event={"ID":"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0","Type":"ContainerDied","Data":"4c6cb28056486117a7cacbe282e06f687ec3d7e1358ff656bc53622675e1876a"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.988872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4fa1-account-create-update-rfbmh" event={"ID":"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0","Type":"ContainerStarted","Data":"790ad2eecbce3c645e6ab6640eaf45b3061692e9577b5e054a98c285db98552f"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.992170 4962 generic.go:334] "Generic (PLEG): container finished" podID="75a602fc-6e93-4dff-a482-64734dd6a817" containerID="b3615ed752a8c2690297a5c0938d81c9c0b3433c8a9c6df6f4abc6631c78ffae" exitCode=0 Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 
21:53:31.992247 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gxffv" event={"ID":"75a602fc-6e93-4dff-a482-64734dd6a817","Type":"ContainerDied","Data":"b3615ed752a8c2690297a5c0938d81c9c0b3433c8a9c6df6f4abc6631c78ffae"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.992266 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gxffv" event={"ID":"75a602fc-6e93-4dff-a482-64734dd6a817","Type":"ContainerStarted","Data":"32c6fc9be4000ee276f9e271db11c35dd79b7c5e88eb1adfa32ce7ff1b187a78"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.993974 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" event={"ID":"287023aa-b843-4ff9-bca0-7c5fcfc67688","Type":"ContainerStarted","Data":"b260d26addbd49a8a9f9d59648defcb6b45e3e5b147d3fdcf866e7baf96f703f"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.993998 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" event={"ID":"287023aa-b843-4ff9-bca0-7c5fcfc67688","Type":"ContainerStarted","Data":"0398f32810dd23b5426853ec92b663a538507fad688e0d2d6251bd2d1432c8aa"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.996644 4962 generic.go:334] "Generic (PLEG): container finished" podID="9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb" containerID="58e8b4e147d076ee2c8ef454df8e0fdd9b54d15263c0c463a2b94585625c6b77" exitCode=0 Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.996677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8b9sd" event={"ID":"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb","Type":"ContainerDied","Data":"58e8b4e147d076ee2c8ef454df8e0fdd9b54d15263c0c463a2b94585625c6b77"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.996714 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8b9sd" event={"ID":"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb","Type":"ContainerStarted","Data":"ba5c2f87b8d2cd4c8b19579860abf44b5bc22dfacad289929cf5fdc65d0eda7d"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.998245 4962 generic.go:334] "Generic (PLEG): container finished" podID="90e37bf0-3279-477d-bb05-cb6744af0908" containerID="9972fa75fbfa3f510c06489b11cac1b796cb70719b4acf05a42d6aa6303f5de1" exitCode=0 Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.998326 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d8a-account-create-update-8j7jd" event={"ID":"90e37bf0-3279-477d-bb05-cb6744af0908","Type":"ContainerDied","Data":"9972fa75fbfa3f510c06489b11cac1b796cb70719b4acf05a42d6aa6303f5de1"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.998348 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d8a-account-create-update-8j7jd" event={"ID":"90e37bf0-3279-477d-bb05-cb6744af0908","Type":"ContainerStarted","Data":"9b36dc2721b9910c253cae696a0501293a88a28e656b6a3424a339f761149caf"} Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.999876 4962 generic.go:334] "Generic (PLEG): container finished" podID="40f2f133-6737-4ab7-ab43-4de519e2d4c0" containerID="8a1c0fb7af3abf3b53e289d52dcbb4c123cacf5925374f7d1493f706edb38d79" exitCode=0 Dec 01 21:53:31 crc kubenswrapper[4962]: I1201 21:53:31.999895 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3f94-account-create-update-tcsxp" 
event={"ID":"40f2f133-6737-4ab7-ab43-4de519e2d4c0","Type":"ContainerDied","Data":"8a1c0fb7af3abf3b53e289d52dcbb4c123cacf5925374f7d1493f706edb38d79"} Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.001864 4962 generic.go:334] "Generic (PLEG): container finished" podID="c05e4ab6-7ea8-4adb-b276-f2c1883ac638" containerID="32b621ce7bb8e9b7672dc0f0861f7989a2e83922bd9cdcc8a372a5bdc0b8ada4" exitCode=0 Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.001921 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-qd95v" event={"ID":"c05e4ab6-7ea8-4adb-b276-f2c1883ac638","Type":"ContainerDied","Data":"32b621ce7bb8e9b7672dc0f0861f7989a2e83922bd9cdcc8a372a5bdc0b8ada4"} Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.001967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-qd95v" event={"ID":"c05e4ab6-7ea8-4adb-b276-f2c1883ac638","Type":"ContainerStarted","Data":"62fde53561ff9b36152e846dc2b7f2016e6a333ef6579ba62eccb6f1b81926fe"} Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.025198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerStarted","Data":"3675bc4a7786ba872510fd4c188c15df2d44bbd1efdb4336d12b622aa06dd5b1"} Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.027307 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc37b492-d684-42d9-a258-391677fbb9d7" containerID="29e1028ed8f8120707ef6f71cc59fdc41314e0ea8c38c91d05363a3950f63b1f" exitCode=0 Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.027400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bk224" event={"ID":"fc37b492-d684-42d9-a258-391677fbb9d7","Type":"ContainerDied","Data":"29e1028ed8f8120707ef6f71cc59fdc41314e0ea8c38c91d05363a3950f63b1f"} Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.027459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bk224" event={"ID":"fc37b492-d684-42d9-a258-391677fbb9d7","Type":"ContainerStarted","Data":"d8867d45ffb639e5d4fe10549b77bb85a1258807ed1a7346f2fd4fa809a3e907"} Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.030820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fe00e319-7859-4bac-9316-156263865d80","Type":"ContainerStarted","Data":"48d9bf5b6c642e7fc6b72d4d4a02701499f921fa0b43814db275d3587d98d47d"} Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.032237 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kv54m" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.055537 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" podStartSLOduration=3.055514745 podStartE2EDuration="3.055514745s" podCreationTimestamp="2025-12-01 21:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:53:32.040134577 +0000 UTC m=+1196.141573762" watchObservedRunningTime="2025-12-01 21:53:32.055514745 +0000 UTC m=+1196.156953950" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.153040 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-5db2p" podStartSLOduration=3.273197084 podStartE2EDuration="11.153024103s" podCreationTimestamp="2025-12-01 21:53:21 +0000 UTC" firstStartedPulling="2025-12-01 21:53:22.05589409 +0000 UTC m=+1186.157333275" lastFinishedPulling="2025-12-01 21:53:29.935721099 +0000 UTC m=+1194.037160294" observedRunningTime="2025-12-01 21:53:32.150275015 +0000 UTC m=+1196.251714210" watchObservedRunningTime="2025-12-01 21:53:32.153024103 +0000 UTC m=+1196.254463298" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.180333 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.480664142 podStartE2EDuration="46.180312251s" podCreationTimestamp="2025-12-01 21:52:46 +0000 UTC" firstStartedPulling="2025-12-01 21:52:53.393162238 +0000 UTC m=+1157.494601443" lastFinishedPulling="2025-12-01 21:53:31.092810367 +0000 UTC m=+1195.194249552" observedRunningTime="2025-12-01 21:53:32.166732954 +0000 UTC m=+1196.268172159" watchObservedRunningTime="2025-12-01 21:53:32.180312251 +0000 UTC m=+1196.281751446" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.207243 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kv54m"] Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.214596 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kv54m"] Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.230543 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" path="/var/lib/kubelet/pods/33007873-cb3d-4f47-8883-a60f3f823a16/volumes" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.231408 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75defef6-e656-452e-a623-8e1dd47c8078" path="/var/lib/kubelet/pods/75defef6-e656-452e-a623-8e1dd47c8078/volumes" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.510749 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.510814 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.784198 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:53:32 crc kubenswrapper[4962]: I1201 21:53:32.784530 4962 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.038789 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" event={"ID":"287023aa-b843-4ff9-bca0-7c5fcfc67688","Type":"ContainerDied","Data":"b260d26addbd49a8a9f9d59648defcb6b45e3e5b147d3fdcf866e7baf96f703f"} Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.038379 4962 generic.go:334] "Generic (PLEG): container finished" podID="287023aa-b843-4ff9-bca0-7c5fcfc67688" containerID="b260d26addbd49a8a9f9d59648defcb6b45e3e5b147d3fdcf866e7baf96f703f" exitCode=0 Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.581033 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bk224" Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.668692 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r77bp\" (UniqueName: \"kubernetes.io/projected/fc37b492-d684-42d9-a258-391677fbb9d7-kube-api-access-r77bp\") pod \"fc37b492-d684-42d9-a258-391677fbb9d7\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.668825 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc37b492-d684-42d9-a258-391677fbb9d7-operator-scripts\") pod \"fc37b492-d684-42d9-a258-391677fbb9d7\" (UID: \"fc37b492-d684-42d9-a258-391677fbb9d7\") " Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.669615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc37b492-d684-42d9-a258-391677fbb9d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc37b492-d684-42d9-a258-391677fbb9d7" (UID: "fc37b492-d684-42d9-a258-391677fbb9d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.678286 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc37b492-d684-42d9-a258-391677fbb9d7-kube-api-access-r77bp" (OuterVolumeSpecName: "kube-api-access-r77bp") pod "fc37b492-d684-42d9-a258-391677fbb9d7" (UID: "fc37b492-d684-42d9-a258-391677fbb9d7"). InnerVolumeSpecName "kube-api-access-r77bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.771432 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r77bp\" (UniqueName: \"kubernetes.io/projected/fc37b492-d684-42d9-a258-391677fbb9d7-kube-api-access-r77bp\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.771470 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc37b492-d684-42d9-a258-391677fbb9d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.980275 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:33 crc kubenswrapper[4962]: I1201 21:53:33.995177 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.000363 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.006838 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gxffv" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.019018 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.037323 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.079761 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d8a-account-create-update-8j7jd" event={"ID":"90e37bf0-3279-477d-bb05-cb6744af0908","Type":"ContainerDied","Data":"9b36dc2721b9910c253cae696a0501293a88a28e656b6a3424a339f761149caf"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.079804 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b36dc2721b9910c253cae696a0501293a88a28e656b6a3424a339f761149caf" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.079845 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f2f133-6737-4ab7-ab43-4de519e2d4c0-operator-scripts\") pod \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.079955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz6pz\" (UniqueName: \"kubernetes.io/projected/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-kube-api-access-xz6pz\") pod \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.079992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjlft\" (UniqueName: \"kubernetes.io/projected/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-kube-api-access-rjlft\") pod \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080023 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-operator-scripts\") pod \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080026 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3d8a-account-create-update-8j7jd" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e37bf0-3279-477d-bb05-cb6744af0908-operator-scripts\") pod \"90e37bf0-3279-477d-bb05-cb6744af0908\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a602fc-6e93-4dff-a482-64734dd6a817-operator-scripts\") pod \"75a602fc-6e93-4dff-a482-64734dd6a817\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080678 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-operator-scripts\") pod \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\" (UID: \"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080705 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d9bf\" (UniqueName: \"kubernetes.io/projected/90e37bf0-3279-477d-bb05-cb6744af0908-kube-api-access-6d9bf\") pod \"90e37bf0-3279-477d-bb05-cb6744af0908\" (UID: \"90e37bf0-3279-477d-bb05-cb6744af0908\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-operator-scripts\") pod \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\" (UID: \"c05e4ab6-7ea8-4adb-b276-f2c1883ac638\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080826 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqqs4\" (UniqueName: \"kubernetes.io/projected/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-kube-api-access-vqqs4\") pod \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\" (UID: \"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080876 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2dvt\" (UniqueName: \"kubernetes.io/projected/40f2f133-6737-4ab7-ab43-4de519e2d4c0-kube-api-access-v2dvt\") pod \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\" (UID: \"40f2f133-6737-4ab7-ab43-4de519e2d4c0\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.080922 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwc69\" (UniqueName: \"kubernetes.io/projected/75a602fc-6e93-4dff-a482-64734dd6a817-kube-api-access-jwc69\") pod \"75a602fc-6e93-4dff-a482-64734dd6a817\" (UID: \"75a602fc-6e93-4dff-a482-64734dd6a817\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.081229 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f2f133-6737-4ab7-ab43-4de519e2d4c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40f2f133-6737-4ab7-ab43-4de519e2d4c0" (UID: "40f2f133-6737-4ab7-ab43-4de519e2d4c0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.081865 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f2f133-6737-4ab7-ab43-4de519e2d4c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.083448 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75a602fc-6e93-4dff-a482-64734dd6a817-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75a602fc-6e93-4dff-a482-64734dd6a817" (UID: "75a602fc-6e93-4dff-a482-64734dd6a817"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.084602 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0" (UID: "057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.085859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a602fc-6e93-4dff-a482-64734dd6a817-kube-api-access-jwc69" (OuterVolumeSpecName: "kube-api-access-jwc69") pod "75a602fc-6e93-4dff-a482-64734dd6a817" (UID: "75a602fc-6e93-4dff-a482-64734dd6a817"). InnerVolumeSpecName "kube-api-access-jwc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.085833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerStarted","Data":"bd0853a1c41936fe5edca16315e3a14860a7e14835a6c36bfcc4f4563e8138c8"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.086295 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90e37bf0-3279-477d-bb05-cb6744af0908-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90e37bf0-3279-477d-bb05-cb6744af0908" (UID: "90e37bf0-3279-477d-bb05-cb6744af0908"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.086336 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-kube-api-access-xz6pz" (OuterVolumeSpecName: "kube-api-access-xz6pz") pod "9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb" (UID: "9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb"). InnerVolumeSpecName "kube-api-access-xz6pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.086425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c05e4ab6-7ea8-4adb-b276-f2c1883ac638" (UID: "c05e4ab6-7ea8-4adb-b276-f2c1883ac638"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.087470 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb" (UID: "9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.090436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-kube-api-access-rjlft" (OuterVolumeSpecName: "kube-api-access-rjlft") pod "c05e4ab6-7ea8-4adb-b276-f2c1883ac638" (UID: "c05e4ab6-7ea8-4adb-b276-f2c1883ac638"). InnerVolumeSpecName "kube-api-access-rjlft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.093333 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e37bf0-3279-477d-bb05-cb6744af0908-kube-api-access-6d9bf" (OuterVolumeSpecName: "kube-api-access-6d9bf") pod "90e37bf0-3279-477d-bb05-cb6744af0908" (UID: "90e37bf0-3279-477d-bb05-cb6744af0908"). InnerVolumeSpecName "kube-api-access-6d9bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.095384 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f2f133-6737-4ab7-ab43-4de519e2d4c0-kube-api-access-v2dvt" (OuterVolumeSpecName: "kube-api-access-v2dvt") pod "40f2f133-6737-4ab7-ab43-4de519e2d4c0" (UID: "40f2f133-6737-4ab7-ab43-4de519e2d4c0"). InnerVolumeSpecName "kube-api-access-v2dvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.096993 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bk224" event={"ID":"fc37b492-d684-42d9-a258-391677fbb9d7","Type":"ContainerDied","Data":"d8867d45ffb639e5d4fe10549b77bb85a1258807ed1a7346f2fd4fa809a3e907"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.097036 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8867d45ffb639e5d4fe10549b77bb85a1258807ed1a7346f2fd4fa809a3e907" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.097095 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bk224" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.097150 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-kube-api-access-vqqs4" (OuterVolumeSpecName: "kube-api-access-vqqs4") pod "057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0" (UID: "057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0"). InnerVolumeSpecName "kube-api-access-vqqs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.111394 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3f94-account-create-update-tcsxp" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.111380 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3f94-account-create-update-tcsxp" event={"ID":"40f2f133-6737-4ab7-ab43-4de519e2d4c0","Type":"ContainerDied","Data":"a2aee4db5ca85296a68119a66b5ec1ae23c226971a2f8169c7f17261139a22e3"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.111461 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2aee4db5ca85296a68119a66b5ec1ae23c226971a2f8169c7f17261139a22e3" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.113312 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4fa1-account-create-update-rfbmh" event={"ID":"057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0","Type":"ContainerDied","Data":"790ad2eecbce3c645e6ab6640eaf45b3061692e9577b5e054a98c285db98552f"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.113337 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="790ad2eecbce3c645e6ab6640eaf45b3061692e9577b5e054a98c285db98552f" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.113379 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4fa1-account-create-update-rfbmh" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.118758 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-qd95v" event={"ID":"c05e4ab6-7ea8-4adb-b276-f2c1883ac638","Type":"ContainerDied","Data":"62fde53561ff9b36152e846dc2b7f2016e6a333ef6579ba62eccb6f1b81926fe"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.118952 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62fde53561ff9b36152e846dc2b7f2016e6a333ef6579ba62eccb6f1b81926fe" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.119121 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-qd95v" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.120918 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gxffv" event={"ID":"75a602fc-6e93-4dff-a482-64734dd6a817","Type":"ContainerDied","Data":"32c6fc9be4000ee276f9e271db11c35dd79b7c5e88eb1adfa32ce7ff1b187a78"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.120972 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c6fc9be4000ee276f9e271db11c35dd79b7c5e88eb1adfa32ce7ff1b187a78" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.121037 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gxffv" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.124902 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8b9sd" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.124989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8b9sd" event={"ID":"9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb","Type":"ContainerDied","Data":"ba5c2f87b8d2cd4c8b19579860abf44b5bc22dfacad289929cf5fdc65d0eda7d"} Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.125142 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba5c2f87b8d2cd4c8b19579860abf44b5bc22dfacad289929cf5fdc65d0eda7d" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184375 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e37bf0-3279-477d-bb05-cb6744af0908-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184423 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a602fc-6e93-4dff-a482-64734dd6a817-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184434 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184443 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d9bf\" (UniqueName: \"kubernetes.io/projected/90e37bf0-3279-477d-bb05-cb6744af0908-kube-api-access-6d9bf\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184458 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184467 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqqs4\" (UniqueName: \"kubernetes.io/projected/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-kube-api-access-vqqs4\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184477 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2dvt\" (UniqueName: \"kubernetes.io/projected/40f2f133-6737-4ab7-ab43-4de519e2d4c0-kube-api-access-v2dvt\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184486 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwc69\" (UniqueName: \"kubernetes.io/projected/75a602fc-6e93-4dff-a482-64734dd6a817-kube-api-access-jwc69\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184510 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz6pz\" (UniqueName: \"kubernetes.io/projected/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb-kube-api-access-xz6pz\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184518 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjlft\" (UniqueName: \"kubernetes.io/projected/c05e4ab6-7ea8-4adb-b276-f2c1883ac638-kube-api-access-rjlft\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.184527 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.437571 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.489813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/287023aa-b843-4ff9-bca0-7c5fcfc67688-operator-scripts\") pod \"287023aa-b843-4ff9-bca0-7c5fcfc67688\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.490056 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d75vw\" (UniqueName: \"kubernetes.io/projected/287023aa-b843-4ff9-bca0-7c5fcfc67688-kube-api-access-d75vw\") pod \"287023aa-b843-4ff9-bca0-7c5fcfc67688\" (UID: \"287023aa-b843-4ff9-bca0-7c5fcfc67688\") " Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.490266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/287023aa-b843-4ff9-bca0-7c5fcfc67688-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "287023aa-b843-4ff9-bca0-7c5fcfc67688" (UID: "287023aa-b843-4ff9-bca0-7c5fcfc67688"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.490790 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/287023aa-b843-4ff9-bca0-7c5fcfc67688-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.494365 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287023aa-b843-4ff9-bca0-7c5fcfc67688-kube-api-access-d75vw" (OuterVolumeSpecName: "kube-api-access-d75vw") pod "287023aa-b843-4ff9-bca0-7c5fcfc67688" (UID: "287023aa-b843-4ff9-bca0-7c5fcfc67688"). InnerVolumeSpecName "kube-api-access-d75vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:34 crc kubenswrapper[4962]: I1201 21:53:34.592664 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d75vw\" (UniqueName: \"kubernetes.io/projected/287023aa-b843-4ff9-bca0-7c5fcfc67688-kube-api-access-d75vw\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:35 crc kubenswrapper[4962]: I1201 21:53:35.138156 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" event={"ID":"287023aa-b843-4ff9-bca0-7c5fcfc67688","Type":"ContainerDied","Data":"0398f32810dd23b5426853ec92b663a538507fad688e0d2d6251bd2d1432c8aa"} Dec 01 21:53:35 crc kubenswrapper[4962]: I1201 21:53:35.138199 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0398f32810dd23b5426853ec92b663a538507fad688e0d2d6251bd2d1432c8aa" Dec 01 21:53:35 crc kubenswrapper[4962]: I1201 21:53:35.138233 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-3071-account-create-update-pqf8s" Dec 01 21:53:35 crc kubenswrapper[4962]: I1201 21:53:35.596080 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 21:53:36 crc kubenswrapper[4962]: I1201 21:53:36.567596 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:36 crc kubenswrapper[4962]: E1201 21:53:36.567803 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 21:53:36 crc kubenswrapper[4962]: E1201 21:53:36.567824 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 21:53:36 crc kubenswrapper[4962]: E1201 21:53:36.567873 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift podName:f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3 nodeName:}" failed. No retries permitted until 2025-12-01 21:53:52.567857709 +0000 UTC m=+1216.669296904 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift") pod "swift-storage-0" (UID: "f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3") : configmap "swift-ring-files" not found Dec 01 21:53:37 crc kubenswrapper[4962]: I1201 21:53:37.559836 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.063921 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066061 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc37b492-d684-42d9-a258-391677fbb9d7" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066079 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc37b492-d684-42d9-a258-391677fbb9d7" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066100 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f2f133-6737-4ab7-ab43-4de519e2d4c0" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066110 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f2f133-6737-4ab7-ab43-4de519e2d4c0" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066128 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066136 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066153 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05e4ab6-7ea8-4adb-b276-f2c1883ac638" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066161 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c05e4ab6-7ea8-4adb-b276-f2c1883ac638" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066178 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e37bf0-3279-477d-bb05-cb6744af0908" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066186 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e37bf0-3279-477d-bb05-cb6744af0908" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066202 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" containerName="init" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066210 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" containerName="init" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066228 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066236 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066250 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a602fc-6e93-4dff-a482-64734dd6a817" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066258 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a602fc-6e93-4dff-a482-64734dd6a817" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066275 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287023aa-b843-4ff9-bca0-7c5fcfc67688" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066283 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="287023aa-b843-4ff9-bca0-7c5fcfc67688" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066301 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75defef6-e656-452e-a623-8e1dd47c8078" containerName="console" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066309 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="75defef6-e656-452e-a623-8e1dd47c8078" containerName="console" Dec 01 21:53:38 crc kubenswrapper[4962]: E1201 21:53:38.066328 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" containerName="dnsmasq-dns" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" containerName="dnsmasq-dns" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066587 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05e4ab6-7ea8-4adb-b276-f2c1883ac638" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066605 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc37b492-d684-42d9-a258-391677fbb9d7" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066620 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="75defef6-e656-452e-a623-8e1dd47c8078" containerName="console" Dec 01 21:53:38 
crc kubenswrapper[4962]: I1201 21:53:38.066635 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a602fc-6e93-4dff-a482-64734dd6a817" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066647 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e37bf0-3279-477d-bb05-cb6744af0908" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066657 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="33007873-cb3d-4f47-8883-a60f3f823a16" containerName="dnsmasq-dns" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066667 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f2f133-6737-4ab7-ab43-4de519e2d4c0" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066675 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066696 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="287023aa-b843-4ff9-bca0-7c5fcfc67688" containerName="mariadb-account-create-update" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.066709 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb" containerName="mariadb-database-create" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.073044 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.076153 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5j4qv" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.076346 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.076603 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.077644 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.089207 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.150640 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-28z4p"] Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.152016 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.155764 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.156471 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nj6wq" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.163812 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-28z4p"] Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.175597 4962 generic.go:334] "Generic (PLEG): container finished" podID="34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" containerID="833b74755b103539c261d57def2dacb37e5701587d65924b6de39c3bf25557a5" exitCode=0 Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.175874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5db2p" event={"ID":"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee","Type":"ContainerDied","Data":"833b74755b103539c261d57def2dacb37e5701587d65924b6de39c3bf25557a5"} Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.225806 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.225922 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d09a46-a04a-4b53-aa6c-e24f284063f0-scripts\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.226083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08d09a46-a04a-4b53-aa6c-e24f284063f0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.226135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9phgs\" (UniqueName: \"kubernetes.io/projected/08d09a46-a04a-4b53-aa6c-e24f284063f0-kube-api-access-9phgs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.226172 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-db-sync-config-data\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.226200 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.226428 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-config-data\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.226501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08d09a46-a04a-4b53-aa6c-e24f284063f0-config\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.227319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-combined-ca-bundle\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.227367 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdpmp\" (UniqueName: \"kubernetes.io/projected/235fb826-ef71-488f-b902-efcf5dc9a7dd-kube-api-access-sdpmp\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.227403 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-config-data\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08d09a46-a04a-4b53-aa6c-e24f284063f0-config\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-combined-ca-bundle\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329383 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdpmp\" (UniqueName: \"kubernetes.io/projected/235fb826-ef71-488f-b902-efcf5dc9a7dd-kube-api-access-sdpmp\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329407 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329464 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d09a46-a04a-4b53-aa6c-e24f284063f0-scripts\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329503 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08d09a46-a04a-4b53-aa6c-e24f284063f0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329520 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9phgs\" (UniqueName: \"kubernetes.io/projected/08d09a46-a04a-4b53-aa6c-e24f284063f0-kube-api-access-9phgs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-db-sync-config-data\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.329560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.331058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08d09a46-a04a-4b53-aa6c-e24f284063f0-config\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.331171 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08d09a46-a04a-4b53-aa6c-e24f284063f0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.332278 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d09a46-a04a-4b53-aa6c-e24f284063f0-scripts\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.335453 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-config-data\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.336126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.336203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.336607 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d09a46-a04a-4b53-aa6c-e24f284063f0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.339398 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-combined-ca-bundle\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.340659 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-db-sync-config-data\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.355374 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9phgs\" (UniqueName: \"kubernetes.io/projected/08d09a46-a04a-4b53-aa6c-e24f284063f0-kube-api-access-9phgs\") pod \"ovn-northd-0\" (UID: \"08d09a46-a04a-4b53-aa6c-e24f284063f0\") " pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.355895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdpmp\" (UniqueName: \"kubernetes.io/projected/235fb826-ef71-488f-b902-efcf5dc9a7dd-kube-api-access-sdpmp\") pod \"glance-db-sync-28z4p\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.409136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 21:53:38 crc kubenswrapper[4962]: I1201 21:53:38.472322 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-28z4p" Dec 01 21:53:39 crc kubenswrapper[4962]: I1201 21:53:39.971013 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn"] Dec 01 21:53:39 crc kubenswrapper[4962]: I1201 21:53:39.982224 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:39 crc kubenswrapper[4962]: I1201 21:53:39.985871 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn"] Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.059756 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-af24-account-create-update-g6rz5"] Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.063872 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.068074 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.070234 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dbe1ae4-c145-451d-9350-d1172e7042d3-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-bj9qn\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.070500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tb79\" (UniqueName: \"kubernetes.io/projected/0dbe1ae4-c145-451d-9350-d1172e7042d3-kube-api-access-9tb79\") pod \"mysqld-exporter-openstack-cell1-db-create-bj9qn\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.071177 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-af24-account-create-update-g6rz5"] Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.096567 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.171663 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-dispersionconf\") pod \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.171750 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-swiftconf\") pod \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.171852 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bczsp\" (UniqueName: \"kubernetes.io/projected/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-kube-api-access-bczsp\") pod \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.171898 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-ring-data-devices\") pod \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.171928 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-scripts\") pod \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.171970 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-combined-ca-bundle\") pod \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.171995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-etc-swift\") pod \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\" (UID: \"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee\") " Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.172326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tb79\" (UniqueName: \"kubernetes.io/projected/0dbe1ae4-c145-451d-9350-d1172e7042d3-kube-api-access-9tb79\") pod \"mysqld-exporter-openstack-cell1-db-create-bj9qn\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.172377 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dbe1ae4-c145-451d-9350-d1172e7042d3-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-bj9qn\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.172401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-hctxh\" (UniqueName: \"kubernetes.io/projected/095fd634-41e6-4675-b439-50fe9f184b3a-kube-api-access-hctxh\") pod \"mysqld-exporter-af24-account-create-update-g6rz5\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.172453 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fd634-41e6-4675-b439-50fe9f184b3a-operator-scripts\") pod \"mysqld-exporter-af24-account-create-update-g6rz5\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.173601 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" (UID: "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.173820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dbe1ae4-c145-451d-9350-d1172e7042d3-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-bj9qn\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.175118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" (UID: "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.177671 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-kube-api-access-bczsp" (OuterVolumeSpecName: "kube-api-access-bczsp") pod "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" (UID: "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee"). InnerVolumeSpecName "kube-api-access-bczsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.180323 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" (UID: "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.191583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tb79\" (UniqueName: \"kubernetes.io/projected/0dbe1ae4-c145-451d-9350-d1172e7042d3-kube-api-access-9tb79\") pod \"mysqld-exporter-openstack-cell1-db-create-bj9qn\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.198733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5db2p" event={"ID":"34cbe04f-2bf2-4b5e-bf91-00787b7e4fee","Type":"ContainerDied","Data":"baee82bd951435ebc75516dd6df589905e9751df53568c4bdde12718379643f4"} Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.198916 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baee82bd951435ebc75516dd6df589905e9751df53568c4bdde12718379643f4" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.198918 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5db2p" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.200828 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" (UID: "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.202456 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerStarted","Data":"839c10079b3d9f73ad42d3111a556e84005077ff2ac143ea95a254189ef9b995"} Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.203950 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-scripts" (OuterVolumeSpecName: "scripts") pod "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" (UID: "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.204915 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" (UID: "34cbe04f-2bf2-4b5e-bf91-00787b7e4fee"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.251097 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.5537298 podStartE2EDuration="1m1.25107608s" podCreationTimestamp="2025-12-01 21:52:39 +0000 UTC" firstStartedPulling="2025-12-01 21:52:52.994747414 +0000 UTC m=+1157.096186609" lastFinishedPulling="2025-12-01 21:53:39.692093694 +0000 UTC m=+1203.793532889" observedRunningTime="2025-12-01 21:53:40.224341458 +0000 UTC m=+1204.325780653" watchObservedRunningTime="2025-12-01 21:53:40.25107608 +0000 UTC m=+1204.352515275" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.274852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hctxh\" (UniqueName: \"kubernetes.io/projected/095fd634-41e6-4675-b439-50fe9f184b3a-kube-api-access-hctxh\") pod \"mysqld-exporter-af24-account-create-update-g6rz5\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.274960 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fd634-41e6-4675-b439-50fe9f184b3a-operator-scripts\") pod \"mysqld-exporter-af24-account-create-update-g6rz5\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.275142 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.275154 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.275165 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.275175 4962 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.275182 4962 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.275193 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bczsp\" (UniqueName: \"kubernetes.io/projected/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-kube-api-access-bczsp\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.275201 4962 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/34cbe04f-2bf2-4b5e-bf91-00787b7e4fee-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.276989 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fd634-41e6-4675-b439-50fe9f184b3a-operator-scripts\") pod \"mysqld-exporter-af24-account-create-update-g6rz5\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.292480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hctxh\" (UniqueName: \"kubernetes.io/projected/095fd634-41e6-4675-b439-50fe9f184b3a-kube-api-access-hctxh\") pod \"mysqld-exporter-af24-account-create-update-g6rz5\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.384205 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.387198 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.404155 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.467285 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-28z4p"] Dec 01 21:53:40 crc kubenswrapper[4962]: W1201 21:53:40.471536 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod235fb826_ef71_488f_b902_efcf5dc9a7dd.slice/crio-76176445f256833949ecc0764c276eb96a69cedfb523fafc04b5c6e3274a3e87 WatchSource:0}: Error finding container 76176445f256833949ecc0764c276eb96a69cedfb523fafc04b5c6e3274a3e87: Status 404 returned error can't find the container with id 76176445f256833949ecc0764c276eb96a69cedfb523fafc04b5c6e3274a3e87 Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.761164 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.761228 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.764575 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:40 crc kubenswrapper[4962]: I1201 21:53:40.872288 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn"] Dec 01 21:53:40 crc kubenswrapper[4962]: W1201 21:53:40.874707 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dbe1ae4_c145_451d_9350_d1172e7042d3.slice/crio-41436ef196449f2d323604ef08ccf88344b68380b009a71104c0a45bf59dc8c1 WatchSource:0}: Error finding container 41436ef196449f2d323604ef08ccf88344b68380b009a71104c0a45bf59dc8c1: Status 404 returned error can't find the container with id 41436ef196449f2d323604ef08ccf88344b68380b009a71104c0a45bf59dc8c1 Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.003073 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-af24-account-create-update-g6rz5"] Dec 01 21:53:41 crc kubenswrapper[4962]: W1201 21:53:41.013204 4962 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod095fd634_41e6_4675_b439_50fe9f184b3a.slice/crio-e52f3378ed033e7912fe882e1b1c9694aa0e8de71f80c32585bf9965624c4584 WatchSource:0}: Error finding container e52f3378ed033e7912fe882e1b1c9694aa0e8de71f80c32585bf9965624c4584: Status 404 returned error can't find the container with id e52f3378ed033e7912fe882e1b1c9694aa0e8de71f80c32585bf9965624c4584 Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.214065 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-28z4p" event={"ID":"235fb826-ef71-488f-b902-efcf5dc9a7dd","Type":"ContainerStarted","Data":"76176445f256833949ecc0764c276eb96a69cedfb523fafc04b5c6e3274a3e87"} Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.215726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" event={"ID":"0dbe1ae4-c145-451d-9350-d1172e7042d3","Type":"ContainerStarted","Data":"b689f1199444cf62162b7b56d7045dee1401338f5e4f1e0e90a273c84b6dfe8f"} Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.215771 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" event={"ID":"0dbe1ae4-c145-451d-9350-d1172e7042d3","Type":"ContainerStarted","Data":"41436ef196449f2d323604ef08ccf88344b68380b009a71104c0a45bf59dc8c1"} Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.218139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" event={"ID":"095fd634-41e6-4675-b439-50fe9f184b3a","Type":"ContainerStarted","Data":"e52f3378ed033e7912fe882e1b1c9694aa0e8de71f80c32585bf9965624c4584"} Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.221034 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"08d09a46-a04a-4b53-aa6c-e24f284063f0","Type":"ContainerStarted","Data":"f796839789b35cd5f0baf460139969c31655805c4aa28297d13b2f210ad3be36"} Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.222820 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.233684 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" podStartSLOduration=2.233664916 podStartE2EDuration="2.233664916s" podCreationTimestamp="2025-12-01 21:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:53:41.231763502 +0000 UTC m=+1205.333202697" watchObservedRunningTime="2025-12-01 21:53:41.233664916 +0000 UTC m=+1205.335104101" Dec 01 21:53:41 crc kubenswrapper[4962]: I1201 21:53:41.320190 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" podStartSLOduration=1.32016342 podStartE2EDuration="1.32016342s" podCreationTimestamp="2025-12-01 21:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:53:41.313540932 +0000 UTC m=+1205.414980127" watchObservedRunningTime="2025-12-01 21:53:41.32016342 +0000 UTC m=+1205.421602635" Dec 01 21:53:42 crc kubenswrapper[4962]: I1201 21:53:42.232296 4962 generic.go:334] "Generic (PLEG): container finished" podID="095fd634-41e6-4675-b439-50fe9f184b3a" 
containerID="0efd232c766d87d65cec6a2ccc46feda5170654f1da631611abc283402224ee7" exitCode=0 Dec 01 21:53:42 crc kubenswrapper[4962]: I1201 21:53:42.232392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" event={"ID":"095fd634-41e6-4675-b439-50fe9f184b3a","Type":"ContainerDied","Data":"0efd232c766d87d65cec6a2ccc46feda5170654f1da631611abc283402224ee7"} Dec 01 21:53:42 crc kubenswrapper[4962]: I1201 21:53:42.244299 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"08d09a46-a04a-4b53-aa6c-e24f284063f0","Type":"ContainerStarted","Data":"456307c8fa8fc3bbcdda4c49aafdc662a97f7d42d8910f8787e86f80baba2b46"} Dec 01 21:53:42 crc kubenswrapper[4962]: I1201 21:53:42.252148 4962 generic.go:334] "Generic (PLEG): container finished" podID="0dbe1ae4-c145-451d-9350-d1172e7042d3" containerID="b689f1199444cf62162b7b56d7045dee1401338f5e4f1e0e90a273c84b6dfe8f" exitCode=0 Dec 01 21:53:42 crc kubenswrapper[4962]: I1201 21:53:42.253116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" event={"ID":"0dbe1ae4-c145-451d-9350-d1172e7042d3","Type":"ContainerDied","Data":"b689f1199444cf62162b7b56d7045dee1401338f5e4f1e0e90a273c84b6dfe8f"} Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.263232 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"08d09a46-a04a-4b53-aa6c-e24f284063f0","Type":"ContainerStarted","Data":"f50a9cd4cc6806fc80d038de3b6ed461923f91aa21c0a3b2cad2a4d4a74cbc7e"} Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.263588 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.896671 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.904663 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.927357 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.381298583 podStartE2EDuration="5.927330072s" podCreationTimestamp="2025-12-01 21:53:38 +0000 UTC" firstStartedPulling="2025-12-01 21:53:40.39918811 +0000 UTC m=+1204.500627305" lastFinishedPulling="2025-12-01 21:53:41.945219569 +0000 UTC m=+1206.046658794" observedRunningTime="2025-12-01 21:53:43.288329926 +0000 UTC m=+1207.389769141" watchObservedRunningTime="2025-12-01 21:53:43.927330072 +0000 UTC m=+1208.028769297" Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.976712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tb79\" (UniqueName: \"kubernetes.io/projected/0dbe1ae4-c145-451d-9350-d1172e7042d3-kube-api-access-9tb79\") pod \"0dbe1ae4-c145-451d-9350-d1172e7042d3\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.976837 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fd634-41e6-4675-b439-50fe9f184b3a-operator-scripts\") pod \"095fd634-41e6-4675-b439-50fe9f184b3a\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.976924 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hctxh\" (UniqueName: \"kubernetes.io/projected/095fd634-41e6-4675-b439-50fe9f184b3a-kube-api-access-hctxh\") pod \"095fd634-41e6-4675-b439-50fe9f184b3a\" (UID: \"095fd634-41e6-4675-b439-50fe9f184b3a\") " Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.977044 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dbe1ae4-c145-451d-9350-d1172e7042d3-operator-scripts\") pod \"0dbe1ae4-c145-451d-9350-d1172e7042d3\" (UID: \"0dbe1ae4-c145-451d-9350-d1172e7042d3\") " Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.977547 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095fd634-41e6-4675-b439-50fe9f184b3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "095fd634-41e6-4675-b439-50fe9f184b3a" (UID: "095fd634-41e6-4675-b439-50fe9f184b3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.977652 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dbe1ae4-c145-451d-9350-d1172e7042d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dbe1ae4-c145-451d-9350-d1172e7042d3" (UID: "0dbe1ae4-c145-451d-9350-d1172e7042d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.983703 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbe1ae4-c145-451d-9350-d1172e7042d3-kube-api-access-9tb79" (OuterVolumeSpecName: "kube-api-access-9tb79") pod "0dbe1ae4-c145-451d-9350-d1172e7042d3" (UID: "0dbe1ae4-c145-451d-9350-d1172e7042d3"). InnerVolumeSpecName "kube-api-access-9tb79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:43 crc kubenswrapper[4962]: I1201 21:53:43.990184 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095fd634-41e6-4675-b439-50fe9f184b3a-kube-api-access-hctxh" (OuterVolumeSpecName: "kube-api-access-hctxh") pod "095fd634-41e6-4675-b439-50fe9f184b3a" (UID: "095fd634-41e6-4675-b439-50fe9f184b3a"). InnerVolumeSpecName "kube-api-access-hctxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.037654 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.037978 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="prometheus" containerID="cri-o://3675bc4a7786ba872510fd4c188c15df2d44bbd1efdb4336d12b622aa06dd5b1" gracePeriod=600 Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.038112 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="thanos-sidecar" containerID="cri-o://839c10079b3d9f73ad42d3111a556e84005077ff2ac143ea95a254189ef9b995" gracePeriod=600 Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.038167 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="config-reloader" containerID="cri-o://bd0853a1c41936fe5edca16315e3a14860a7e14835a6c36bfcc4f4563e8138c8" gracePeriod=600 Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.078842 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fd634-41e6-4675-b439-50fe9f184b3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.078871 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hctxh\" (UniqueName: \"kubernetes.io/projected/095fd634-41e6-4675-b439-50fe9f184b3a-kube-api-access-hctxh\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.078882 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dbe1ae4-c145-451d-9350-d1172e7042d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.078891 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tb79\" (UniqueName: \"kubernetes.io/projected/0dbe1ae4-c145-451d-9350-d1172e7042d3-kube-api-access-9tb79\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.289492 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerID="839c10079b3d9f73ad42d3111a556e84005077ff2ac143ea95a254189ef9b995" exitCode=0 Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.289530 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerID="3675bc4a7786ba872510fd4c188c15df2d44bbd1efdb4336d12b622aa06dd5b1" exitCode=0 Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.289627 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerDied","Data":"839c10079b3d9f73ad42d3111a556e84005077ff2ac143ea95a254189ef9b995"} Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.289656 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerDied","Data":"3675bc4a7786ba872510fd4c188c15df2d44bbd1efdb4336d12b622aa06dd5b1"} Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.296587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" event={"ID":"0dbe1ae4-c145-451d-9350-d1172e7042d3","Type":"ContainerDied","Data":"41436ef196449f2d323604ef08ccf88344b68380b009a71104c0a45bf59dc8c1"} Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.296629 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41436ef196449f2d323604ef08ccf88344b68380b009a71104c0a45bf59dc8c1" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.296605 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.300562 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.301311 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-af24-account-create-update-g6rz5" event={"ID":"095fd634-41e6-4675-b439-50fe9f184b3a","Type":"ContainerDied","Data":"e52f3378ed033e7912fe882e1b1c9694aa0e8de71f80c32585bf9965624c4584"} Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.301342 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e52f3378ed033e7912fe882e1b1c9694aa0e8de71f80c32585bf9965624c4584" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.409105 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.718372 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lpdtt"] Dec 01 21:53:44 crc kubenswrapper[4962]: E1201 21:53:44.719131 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbe1ae4-c145-451d-9350-d1172e7042d3" containerName="mariadb-database-create" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.719151 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbe1ae4-c145-451d-9350-d1172e7042d3" containerName="mariadb-database-create" Dec 01 21:53:44 crc kubenswrapper[4962]: E1201 21:53:44.719164 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" containerName="swift-ring-rebalance" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.719171 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" containerName="swift-ring-rebalance" Dec 01 21:53:44 crc kubenswrapper[4962]: E1201 21:53:44.719193 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095fd634-41e6-4675-b439-50fe9f184b3a" containerName="mariadb-account-create-update" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.719199 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="095fd634-41e6-4675-b439-50fe9f184b3a" containerName="mariadb-account-create-update" Dec 
01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.719383 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="34cbe04f-2bf2-4b5e-bf91-00787b7e4fee" containerName="swift-ring-rebalance" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.719406 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="095fd634-41e6-4675-b439-50fe9f184b3a" containerName="mariadb-account-create-update" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.719419 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbe1ae4-c145-451d-9350-d1172e7042d3" containerName="mariadb-database-create" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.720184 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.728563 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lpdtt"] Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.791881 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.801918 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8svw9\" (UniqueName: \"kubernetes.io/projected/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-kube-api-access-8svw9\") pod \"cinder-db-create-lpdtt\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.801989 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-operator-scripts\") pod \"cinder-db-create-lpdtt\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.820460 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bg7z9"] Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.839089 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.846157 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-bbd8-account-create-update-tnv4m"] Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.847374 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.862385 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.865770 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bg7z9"] Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.888851 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-bbd8-account-create-update-tnv4m"] Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.904330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b539cc27-f876-40e3-b77a-8af750ce5b3a-operator-scripts\") pod \"heat-bbd8-account-create-update-tnv4m\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.904372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-operator-scripts\") pod \"cinder-db-create-lpdtt\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.904390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e77231-0423-458f-8262-aa12c2536566-operator-scripts\") pod \"barbican-db-create-bg7z9\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.904451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbhcf\" (UniqueName: \"kubernetes.io/projected/b539cc27-f876-40e3-b77a-8af750ce5b3a-kube-api-access-zbhcf\") pod \"heat-bbd8-account-create-update-tnv4m\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.904487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5zpb\" (UniqueName: \"kubernetes.io/projected/29e77231-0423-458f-8262-aa12c2536566-kube-api-access-l5zpb\") pod \"barbican-db-create-bg7z9\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.904756 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8svw9\" (UniqueName: \"kubernetes.io/projected/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-kube-api-access-8svw9\") pod \"cinder-db-create-lpdtt\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.905676 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-operator-scripts\") pod \"cinder-db-create-lpdtt\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.933355 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8svw9\" (UniqueName: \"kubernetes.io/projected/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-kube-api-access-8svw9\") pod \"cinder-db-create-lpdtt\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.962866 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-58e3-account-create-update-qf5td"] Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.964329 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.967233 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 21:53:44 crc kubenswrapper[4962]: I1201 21:53:44.967429 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-58e3-account-create-update-qf5td"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.007560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b539cc27-f876-40e3-b77a-8af750ce5b3a-operator-scripts\") pod \"heat-bbd8-account-create-update-tnv4m\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.007602 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e77231-0423-458f-8262-aa12c2536566-operator-scripts\") pod \"barbican-db-create-bg7z9\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.007636 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbhcf\" (UniqueName: \"kubernetes.io/projected/b539cc27-f876-40e3-b77a-8af750ce5b3a-kube-api-access-zbhcf\") pod \"heat-bbd8-account-create-update-tnv4m\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.007657 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-operator-scripts\") pod \"barbican-58e3-account-create-update-qf5td\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.007677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5zpb\" (UniqueName: \"kubernetes.io/projected/29e77231-0423-458f-8262-aa12c2536566-kube-api-access-l5zpb\") pod \"barbican-db-create-bg7z9\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.007788 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6fht\" (UniqueName: \"kubernetes.io/projected/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-kube-api-access-j6fht\") pod \"barbican-58e3-account-create-update-qf5td\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.008480 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b539cc27-f876-40e3-b77a-8af750ce5b3a-operator-scripts\") pod \"heat-bbd8-account-create-update-tnv4m\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.008973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e77231-0423-458f-8262-aa12c2536566-operator-scripts\") pod \"barbican-db-create-bg7z9\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.024766 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-f7vp4"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.026021 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.048395 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-f7vp4"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.061085 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbhcf\" (UniqueName: \"kubernetes.io/projected/b539cc27-f876-40e3-b77a-8af750ce5b3a-kube-api-access-zbhcf\") pod \"heat-bbd8-account-create-update-tnv4m\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.077113 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8031-account-create-update-x2qgm"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.078728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.080689 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.080905 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5zpb\" (UniqueName: \"kubernetes.io/projected/29e77231-0423-458f-8262-aa12c2536566-kube-api-access-l5zpb\") pod \"barbican-db-create-bg7z9\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.091446 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.109146 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-operator-scripts\") pod \"heat-db-create-f7vp4\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.109201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6fht\" (UniqueName: \"kubernetes.io/projected/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-kube-api-access-j6fht\") pod \"barbican-58e3-account-create-update-qf5td\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.109304 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-operator-scripts\") pod \"barbican-58e3-account-create-update-qf5td\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.109345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdkwr\" (UniqueName: \"kubernetes.io/projected/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-kube-api-access-qdkwr\") pod \"heat-db-create-f7vp4\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.110238 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-operator-scripts\") pod \"barbican-58e3-account-create-update-qf5td\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.134433 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8031-account-create-update-x2qgm"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.144697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6fht\" (UniqueName: \"kubernetes.io/projected/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-kube-api-access-j6fht\") pod \"barbican-58e3-account-create-update-qf5td\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.163402 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.208849 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.216227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7515449-4f20-4673-ac23-a7a5a40f852d-operator-scripts\") pod \"cinder-8031-account-create-update-x2qgm\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.216282 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-operator-scripts\") pod \"heat-db-create-f7vp4\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.216417 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwm74\" (UniqueName: \"kubernetes.io/projected/a7515449-4f20-4673-ac23-a7a5a40f852d-kube-api-access-jwm74\") pod \"cinder-8031-account-create-update-x2qgm\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.216436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdkwr\" (UniqueName: \"kubernetes.io/projected/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-kube-api-access-qdkwr\") pod \"heat-db-create-f7vp4\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.217335 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-operator-scripts\") pod \"heat-db-create-f7vp4\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.277763 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdkwr\" (UniqueName: \"kubernetes.io/projected/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-kube-api-access-qdkwr\") pod \"heat-db-create-f7vp4\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.281385 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qd22d"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.289589 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.299354 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.299548 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.299665 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.300143 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tlj7d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.300860 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.322850 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-combined-ca-bundle\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.322901 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6cks\" (UniqueName: \"kubernetes.io/projected/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-kube-api-access-c6cks\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.322943 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwm74\" (UniqueName: \"kubernetes.io/projected/a7515449-4f20-4673-ac23-a7a5a40f852d-kube-api-access-jwm74\") pod \"cinder-8031-account-create-update-x2qgm\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.323022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-config-data\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.323059 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7515449-4f20-4673-ac23-a7a5a40f852d-operator-scripts\") pod \"cinder-8031-account-create-update-x2qgm\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.323859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7515449-4f20-4673-ac23-a7a5a40f852d-operator-scripts\") pod \"cinder-8031-account-create-update-x2qgm\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.345516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwm74\" (UniqueName: 
\"kubernetes.io/projected/a7515449-4f20-4673-ac23-a7a5a40f852d-kube-api-access-jwm74\") pod \"cinder-8031-account-create-update-x2qgm\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.355499 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.356137 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qd22d"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.390264 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerID="bd0853a1c41936fe5edca16315e3a14860a7e14835a6c36bfcc4f4563e8138c8" exitCode=0 Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.390314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerDied","Data":"bd0853a1c41936fe5edca16315e3a14860a7e14835a6c36bfcc4f4563e8138c8"} Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.424418 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-combined-ca-bundle\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.424465 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6cks\" (UniqueName: \"kubernetes.io/projected/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-kube-api-access-c6cks\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.424559 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-config-data\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.431318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-combined-ca-bundle\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.458644 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-config-data\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.461528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6cks\" (UniqueName: \"kubernetes.io/projected/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-kube-api-access-c6cks\") pod \"keystone-db-sync-qd22d\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.476205 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-dbc8-account-create-update-x8lg9"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.477538 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.483419 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.504789 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dbc8-account-create-update-x8lg9"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.526602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-operator-scripts\") pod \"neutron-dbc8-account-create-update-x8lg9\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.526777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs2gz\" (UniqueName: \"kubernetes.io/projected/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-kube-api-access-vs2gz\") pod \"neutron-dbc8-account-create-update-x8lg9\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.612395 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dr5bl"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.618050 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.630790 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.650730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h5pz\" (UniqueName: \"kubernetes.io/projected/37f45ac0-c39d-4785-957e-e69e1659927e-kube-api-access-8h5pz\") pod \"neutron-db-create-dr5bl\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.650914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f45ac0-c39d-4785-957e-e69e1659927e-operator-scripts\") pod \"neutron-db-create-dr5bl\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.650988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs2gz\" (UniqueName: \"kubernetes.io/projected/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-kube-api-access-vs2gz\") pod \"neutron-dbc8-account-create-update-x8lg9\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.651225 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-operator-scripts\") pod \"neutron-dbc8-account-create-update-x8lg9\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.651894 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dr5bl"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.652211 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-operator-scripts\") pod \"neutron-dbc8-account-create-update-x8lg9\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.653738 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.663800 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qd22d" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.707759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs2gz\" (UniqueName: \"kubernetes.io/projected/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-kube-api-access-vs2gz\") pod \"neutron-dbc8-account-create-update-x8lg9\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.742179 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:53:45 crc kubenswrapper[4962]: E1201 21:53:45.742892 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="thanos-sidecar" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.742907 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="thanos-sidecar" Dec 01 21:53:45 crc kubenswrapper[4962]: E1201 21:53:45.742946 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="init-config-reloader" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.742953 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="init-config-reloader" Dec 01 21:53:45 crc kubenswrapper[4962]: E1201 21:53:45.742977 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="config-reloader" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.742983 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="config-reloader" Dec 01 21:53:45 crc kubenswrapper[4962]: E1201 21:53:45.743011 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="prometheus" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.743017 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="prometheus" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.743400 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="thanos-sidecar" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.743437 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="prometheus" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.743450 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" containerName="config-reloader" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.744992 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.749311 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.754878 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.768629 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-thanos-prometheus-http-client-file\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.768684 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-tls-assets\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.768746 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-web-config\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.768791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f8782cd-368d-4071-848d-8ad2379ddf6c-config-out\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.768867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7xqn\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-kube-api-access-v7xqn\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.768983 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.769019 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4f8782cd-368d-4071-848d-8ad2379ddf6c-prometheus-metric-storage-rulefiles-0\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.769072 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-config\") pod \"4f8782cd-368d-4071-848d-8ad2379ddf6c\" (UID: \"4f8782cd-368d-4071-848d-8ad2379ddf6c\") " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.771138 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-combined-ca-bundle\") pod 
\"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.771198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h5pz\" (UniqueName: \"kubernetes.io/projected/37f45ac0-c39d-4785-957e-e69e1659927e-kube-api-access-8h5pz\") pod \"neutron-db-create-dr5bl\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.771236 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f45ac0-c39d-4785-957e-e69e1659927e-operator-scripts\") pod \"neutron-db-create-dr5bl\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.771322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-config-data\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.771342 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhvx\" (UniqueName: \"kubernetes.io/projected/a120e58c-62c2-4242-a668-151b872a9cb4-kube-api-access-8rhvx\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.773375 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8782cd-368d-4071-848d-8ad2379ddf6c-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.774786 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-kube-api-access-v7xqn" (OuterVolumeSpecName: "kube-api-access-v7xqn") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "kube-api-access-v7xqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.775223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f45ac0-c39d-4785-957e-e69e1659927e-operator-scripts\") pod \"neutron-db-create-dr5bl\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.784743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8782cd-368d-4071-848d-8ad2379ddf6c-config-out" (OuterVolumeSpecName: "config-out") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.784781 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.784908 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.786041 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.794112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-config" (OuterVolumeSpecName: "config") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.806175 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h5pz\" (UniqueName: \"kubernetes.io/projected/37f45ac0-c39d-4785-957e-e69e1659927e-kube-api-access-8h5pz\") pod \"neutron-db-create-dr5bl\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.808517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-web-config" (OuterVolumeSpecName: "web-config") pod "4f8782cd-368d-4071-848d-8ad2379ddf6c" (UID: "4f8782cd-368d-4071-848d-8ad2379ddf6c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.836784 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873213 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-config-data\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873239 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhvx\" (UniqueName: \"kubernetes.io/projected/a120e58c-62c2-4242-a668-151b872a9cb4-kube-api-access-8rhvx\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873325 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873341 4962 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4f8782cd-368d-4071-848d-8ad2379ddf6c-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873351 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873361 4962 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873369 4962 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873377 4962 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f8782cd-368d-4071-848d-8ad2379ddf6c-web-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873386 4962 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f8782cd-368d-4071-848d-8ad2379ddf6c-config-out\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.873395 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7xqn\" (UniqueName: \"kubernetes.io/projected/4f8782cd-368d-4071-848d-8ad2379ddf6c-kube-api-access-v7xqn\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.879454 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.880456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-config-data\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.894493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhvx\" (UniqueName: \"kubernetes.io/projected/a120e58c-62c2-4242-a668-151b872a9cb4-kube-api-access-8rhvx\") pod \"mysqld-exporter-0\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " pod="openstack/mysqld-exporter-0" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.948332 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.972376 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 01 21:53:45 crc kubenswrapper[4962]: I1201 21:53:45.975544 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.036146 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lpdtt"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.090725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 21:53:46 crc kubenswrapper[4962]: W1201 21:53:46.240893 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29e77231_0423_458f_8262_aa12c2536566.slice/crio-21f7c920dc99e90bdd6b34d448916aac90bb2a1c05c378efaa5e91c90c9abac3 WatchSource:0}: Error finding container 21f7c920dc99e90bdd6b34d448916aac90bb2a1c05c378efaa5e91c90c9abac3: Status 404 returned error can't find the container with id 21f7c920dc99e90bdd6b34d448916aac90bb2a1c05c378efaa5e91c90c9abac3 Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.258404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bg7z9"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.440957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4f8782cd-368d-4071-848d-8ad2379ddf6c","Type":"ContainerDied","Data":"a68f2fb2321cc67b862c7135ec6db51274b6482a0f892e5ba2f3b3c06b66c488"} Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.441248 4962 scope.go:117] "RemoveContainer" containerID="839c10079b3d9f73ad42d3111a556e84005077ff2ac143ea95a254189ef9b995" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.441298 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.452241 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bg7z9" event={"ID":"29e77231-0423-458f-8262-aa12c2536566","Type":"ContainerStarted","Data":"21f7c920dc99e90bdd6b34d448916aac90bb2a1c05c378efaa5e91c90c9abac3"} Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.455202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lpdtt" event={"ID":"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d","Type":"ContainerStarted","Data":"406ae3e11d35520dd2d6c91ed74bae17f7928537aed81575e79575cb521b001b"} Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.475904 4962 scope.go:117] "RemoveContainer" containerID="bd0853a1c41936fe5edca16315e3a14860a7e14835a6c36bfcc4f4563e8138c8" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.493216 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.536243 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.553616 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.557741 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.558732 4962 scope.go:117] "RemoveContainer" containerID="3675bc4a7786ba872510fd4c188c15df2d44bbd1efdb4336d12b622aa06dd5b1" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.565156 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.565470 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.565585 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.565683 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.565784 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.565887 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-65vdm" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.573648 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.581064 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.633095 4962 scope.go:117] "RemoveContainer" containerID="97294dc7e4dd8cc70e7c81893b1ac1dbf62e5ad1c96188ee0339b22ce385b53b" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708613 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708639 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708659 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7b9v\" (UniqueName: \"kubernetes.io/projected/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-kube-api-access-v7b9v\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708679 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708888 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-config\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.708954 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.717066 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-bbd8-account-create-update-tnv4m"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.728745 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-58e3-account-create-update-qf5td"] Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811052 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811200 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811265 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-config\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811318 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811342 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811377 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7b9v\" (UniqueName: \"kubernetes.io/projected/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-kube-api-access-v7b9v\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.811439 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.812693 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.819484 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.825313 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.826081 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.837106 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.840314 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-config\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.841078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.849369 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.852520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.856212 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7b9v\" (UniqueName: \"kubernetes.io/projected/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-kube-api-access-v7b9v\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.857094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.876512 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620\") " pod="openstack/prometheus-metric-storage-0" Dec 01 
21:53:46 crc kubenswrapper[4962]: I1201 21:53:46.926802 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.195338 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8031-account-create-update-x2qgm"] Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.208976 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qd22d"] Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.212574 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dbc8-account-create-update-x8lg9"] Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.371679 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-f7vp4"] Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.435664 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dr5bl"] Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.464890 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:53:47 crc kubenswrapper[4962]: W1201 21:53:47.499678 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f45ac0_c39d_4785_957e_e69e1659927e.slice/crio-4d31ba7440bd7946e52631d3a177735835f6dc93bab266eea444482c1cb03b35 WatchSource:0}: Error finding container 4d31ba7440bd7946e52631d3a177735835f6dc93bab266eea444482c1cb03b35: Status 404 returned error can't find the container with id 4d31ba7440bd7946e52631d3a177735835f6dc93bab266eea444482c1cb03b35 Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.500446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-58e3-account-create-update-qf5td" event={"ID":"8d0bb151-9b17-4471-9b0d-05f74fa33f0c","Type":"ContainerStarted","Data":"439f6ee826c6a9cfb86d2dcf01fcd627b3f5f7a81364853b124c3648e9a5f44d"} Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.521589 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8031-account-create-update-x2qgm" event={"ID":"a7515449-4f20-4673-ac23-a7a5a40f852d","Type":"ContainerStarted","Data":"7dd0db7069aeb2bda06a9fe133ca9a08dede1f37c4f70dbe7c2854243983c84b"} Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.540306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbc8-account-create-update-x8lg9" event={"ID":"5cd1b6eb-52cf-44aa-993a-90d3abec28ad","Type":"ContainerStarted","Data":"c04ade9a52131b0b76590538e22bd9a048b4970d4df8ba951cde653a21dd87d0"} Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.596091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qd22d" event={"ID":"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb","Type":"ContainerStarted","Data":"214e2173990b5e8285f59310540f15223c48240ca91c4cb2003acaf84787e35e"} Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.638168 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-f7vp4" event={"ID":"c889a0f9-46ee-413a-bae1-94ee4eb8f16d","Type":"ContainerStarted","Data":"7a3a1ff9d5f8b0b974386285037c5fc78a4d13bfbd7673fea33d4bc870d7a5f8"} Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.686015 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bbd8-account-create-update-tnv4m" 
event={"ID":"b539cc27-f876-40e3-b77a-8af750ce5b3a","Type":"ContainerStarted","Data":"8505ace8458b3bc3f869bc05421920ea9754616da1600d094d8f2747c6432d77"} Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.739265 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 21:53:47 crc kubenswrapper[4962]: W1201 21:53:47.756872 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb7f1e5_3ee6_4c9f_8d8a_3e66dd8f5620.slice/crio-f8e491fc4d8e5c2b3bae62849dd51669a827d7b3fc756eca16e2ca69613d515e WatchSource:0}: Error finding container f8e491fc4d8e5c2b3bae62849dd51669a827d7b3fc756eca16e2ca69613d515e: Status 404 returned error can't find the container with id f8e491fc4d8e5c2b3bae62849dd51669a827d7b3fc756eca16e2ca69613d515e Dec 01 21:53:47 crc kubenswrapper[4962]: I1201 21:53:47.858845 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.049897 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cdpb9" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.230589 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8782cd-368d-4071-848d-8ad2379ddf6c" path="/var/lib/kubelet/pods/4f8782cd-368d-4071-848d-8ad2379ddf6c/volumes" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.296062 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xd7ph-config-8szbc"] Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.297472 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.303152 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.319221 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xd7ph-config-8szbc"] Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.355892 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-scripts\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.355963 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-additional-scripts\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.355986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6nl4\" (UniqueName: \"kubernetes.io/projected/5987bc9d-49c8-4f77-a8b4-509d2096ca53-kube-api-access-m6nl4\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.356035 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run-ovn\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.356069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-log-ovn\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.356142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.457943 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run-ovn\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.458010 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-log-ovn\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.458094 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.458166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-scripts\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.458198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-additional-scripts\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.458215 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6nl4\" (UniqueName: \"kubernetes.io/projected/5987bc9d-49c8-4f77-a8b4-509d2096ca53-kube-api-access-m6nl4\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: 
I1201 21:53:48.458249 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run-ovn\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.458393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.458474 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-log-ovn\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.459134 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-additional-scripts\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.460364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-scripts\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.478575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6nl4\" (UniqueName: \"kubernetes.io/projected/5987bc9d-49c8-4f77-a8b4-509d2096ca53-kube-api-access-m6nl4\") pod \"ovn-controller-xd7ph-config-8szbc\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.711944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620","Type":"ContainerStarted","Data":"f8e491fc4d8e5c2b3bae62849dd51669a827d7b3fc756eca16e2ca69613d515e"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.713666 4962 generic.go:334] "Generic (PLEG): container finished" podID="b539cc27-f876-40e3-b77a-8af750ce5b3a" containerID="8829dfd12517b7a3c99f992987be4805b73b3ae2e753eca56a8bde276acdaec7" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.713925 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bbd8-account-create-update-tnv4m" event={"ID":"b539cc27-f876-40e3-b77a-8af750ce5b3a","Type":"ContainerDied","Data":"8829dfd12517b7a3c99f992987be4805b73b3ae2e753eca56a8bde276acdaec7"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.716056 4962 generic.go:334] "Generic (PLEG): container finished" podID="8d0bb151-9b17-4471-9b0d-05f74fa33f0c" containerID="2833172e38a500bfe10d49fda192ef2936fa2ad2ae35f63856a63fb81bd140eb" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.716098 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-58e3-account-create-update-qf5td" event={"ID":"8d0bb151-9b17-4471-9b0d-05f74fa33f0c","Type":"ContainerDied","Data":"2833172e38a500bfe10d49fda192ef2936fa2ad2ae35f63856a63fb81bd140eb"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.718153 4962 generic.go:334] "Generic (PLEG): container finished" podID="a7515449-4f20-4673-ac23-a7a5a40f852d" containerID="0e9aab10cf8f465f91a67c0776211da3fcb1b8a3473f1328b1a6d4d40b74c6e2" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.718215 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8031-account-create-update-x2qgm" event={"ID":"a7515449-4f20-4673-ac23-a7a5a40f852d","Type":"ContainerDied","Data":"0e9aab10cf8f465f91a67c0776211da3fcb1b8a3473f1328b1a6d4d40b74c6e2"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.721629 4962 generic.go:334] "Generic (PLEG): container finished" podID="29e77231-0423-458f-8262-aa12c2536566" containerID="30bb1e4077ae583f12c4c6689e334cdf76af4dfb6afe2c61c6b099563f3eb6a2" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.721709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bg7z9" event={"ID":"29e77231-0423-458f-8262-aa12c2536566","Type":"ContainerDied","Data":"30bb1e4077ae583f12c4c6689e334cdf76af4dfb6afe2c61c6b099563f3eb6a2"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.723299 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfa1c735-01c9-4f3e-ae1d-32bc2af0972d" containerID="511ff24f0bc730b93e596c22bc2f62060c72391a0076d989f6d5ad1cf7f2cd12" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.723391 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lpdtt" event={"ID":"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d","Type":"ContainerDied","Data":"511ff24f0bc730b93e596c22bc2f62060c72391a0076d989f6d5ad1cf7f2cd12"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.724885 4962 generic.go:334] "Generic (PLEG): container finished" podID="c889a0f9-46ee-413a-bae1-94ee4eb8f16d" containerID="8f2da0fd1ec5fd019c5150b84d509035ed363835c9f0a0203d2563fdbc807f70" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.724988 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-f7vp4" event={"ID":"c889a0f9-46ee-413a-bae1-94ee4eb8f16d","Type":"ContainerDied","Data":"8f2da0fd1ec5fd019c5150b84d509035ed363835c9f0a0203d2563fdbc807f70"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.728063 4962 generic.go:334] "Generic (PLEG): container finished" podID="37f45ac0-c39d-4785-957e-e69e1659927e" containerID="77f87a9016ecab1c560188b5260a06146211e48bfc5c67248b63450fca93f960" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.728143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dr5bl" event={"ID":"37f45ac0-c39d-4785-957e-e69e1659927e","Type":"ContainerDied","Data":"77f87a9016ecab1c560188b5260a06146211e48bfc5c67248b63450fca93f960"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.728167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dr5bl" event={"ID":"37f45ac0-c39d-4785-957e-e69e1659927e","Type":"ContainerStarted","Data":"4d31ba7440bd7946e52631d3a177735835f6dc93bab266eea444482c1cb03b35"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.733727 4962 generic.go:334] "Generic (PLEG): container finished" podID="5cd1b6eb-52cf-44aa-993a-90d3abec28ad" 
containerID="8ab4004fc2e5dc0799776153309c837efc8148de0b5ea7246b5dc107b10df66a" exitCode=0 Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.733794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbc8-account-create-update-x8lg9" event={"ID":"5cd1b6eb-52cf-44aa-993a-90d3abec28ad","Type":"ContainerDied","Data":"8ab4004fc2e5dc0799776153309c837efc8148de0b5ea7246b5dc107b10df66a"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.734818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a120e58c-62c2-4242-a668-151b872a9cb4","Type":"ContainerStarted","Data":"01314995120ca2a359e62111f59a5bc1ddcbb3896a2be7ff5302f1af741be76a"} Dec 01 21:53:48 crc kubenswrapper[4962]: I1201 21:53:48.788803 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:49 crc kubenswrapper[4962]: I1201 21:53:49.262373 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xd7ph-config-8szbc"] Dec 01 21:53:49 crc kubenswrapper[4962]: I1201 21:53:49.753677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-8szbc" event={"ID":"5987bc9d-49c8-4f77-a8b4-509d2096ca53","Type":"ContainerStarted","Data":"ee46eef84872efd0f6091b889c036282a64fe41fca51819c77b23dfe90ad9487"} Dec 01 21:53:49 crc kubenswrapper[4962]: I1201 21:53:49.754129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-8szbc" event={"ID":"5987bc9d-49c8-4f77-a8b4-509d2096ca53","Type":"ContainerStarted","Data":"2474b0ef9a85f4b2454e15688528043935c4128499d2fbf12c05f821028ec796"} Dec 01 21:53:49 crc kubenswrapper[4962]: I1201 21:53:49.774178 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xd7ph-config-8szbc" podStartSLOduration=1.774161058 podStartE2EDuration="1.774161058s" podCreationTimestamp="2025-12-01 21:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:53:49.772642425 +0000 UTC m=+1213.874081620" watchObservedRunningTime="2025-12-01 21:53:49.774161058 +0000 UTC m=+1213.875600243" Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.765564 4962 generic.go:334] "Generic (PLEG): container finished" podID="5987bc9d-49c8-4f77-a8b4-509d2096ca53" containerID="ee46eef84872efd0f6091b889c036282a64fe41fca51819c77b23dfe90ad9487" exitCode=0 Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.765679 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-8szbc" event={"ID":"5987bc9d-49c8-4f77-a8b4-509d2096ca53","Type":"ContainerDied","Data":"ee46eef84872efd0f6091b889c036282a64fe41fca51819c77b23dfe90ad9487"} Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.771188 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dr5bl" event={"ID":"37f45ac0-c39d-4785-957e-e69e1659927e","Type":"ContainerDied","Data":"4d31ba7440bd7946e52631d3a177735835f6dc93bab266eea444482c1cb03b35"} Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.771253 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d31ba7440bd7946e52631d3a177735835f6dc93bab266eea444482c1cb03b35" Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.786526 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.812770 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f45ac0-c39d-4785-957e-e69e1659927e-operator-scripts\") pod \"37f45ac0-c39d-4785-957e-e69e1659927e\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.813045 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h5pz\" (UniqueName: \"kubernetes.io/projected/37f45ac0-c39d-4785-957e-e69e1659927e-kube-api-access-8h5pz\") pod \"37f45ac0-c39d-4785-957e-e69e1659927e\" (UID: \"37f45ac0-c39d-4785-957e-e69e1659927e\") " Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.814352 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f45ac0-c39d-4785-957e-e69e1659927e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37f45ac0-c39d-4785-957e-e69e1659927e" (UID: "37f45ac0-c39d-4785-957e-e69e1659927e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.824861 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f45ac0-c39d-4785-957e-e69e1659927e-kube-api-access-8h5pz" (OuterVolumeSpecName: "kube-api-access-8h5pz") pod "37f45ac0-c39d-4785-957e-e69e1659927e" (UID: "37f45ac0-c39d-4785-957e-e69e1659927e"). InnerVolumeSpecName "kube-api-access-8h5pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.915676 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h5pz\" (UniqueName: \"kubernetes.io/projected/37f45ac0-c39d-4785-957e-e69e1659927e-kube-api-access-8h5pz\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:50 crc kubenswrapper[4962]: I1201 21:53:50.915715 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f45ac0-c39d-4785-957e-e69e1659927e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.144219 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.154489 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.199646 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.211503 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.223731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8svw9\" (UniqueName: \"kubernetes.io/projected/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-kube-api-access-8svw9\") pod \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.223793 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b539cc27-f876-40e3-b77a-8af750ce5b3a-operator-scripts\") pod \"b539cc27-f876-40e3-b77a-8af750ce5b3a\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.223854 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbhcf\" (UniqueName: \"kubernetes.io/projected/b539cc27-f876-40e3-b77a-8af750ce5b3a-kube-api-access-zbhcf\") pod \"b539cc27-f876-40e3-b77a-8af750ce5b3a\" (UID: \"b539cc27-f876-40e3-b77a-8af750ce5b3a\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.223986 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-operator-scripts\") pod \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\" (UID: \"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.224894 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b539cc27-f876-40e3-b77a-8af750ce5b3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b539cc27-f876-40e3-b77a-8af750ce5b3a" (UID: "b539cc27-f876-40e3-b77a-8af750ce5b3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.225002 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bfa1c735-01c9-4f3e-ae1d-32bc2af0972d" (UID: "bfa1c735-01c9-4f3e-ae1d-32bc2af0972d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.326011 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-operator-scripts\") pod \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.326175 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwm74\" (UniqueName: \"kubernetes.io/projected/a7515449-4f20-4673-ac23-a7a5a40f852d-kube-api-access-jwm74\") pod \"a7515449-4f20-4673-ac23-a7a5a40f852d\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.326217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdkwr\" (UniqueName: \"kubernetes.io/projected/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-kube-api-access-qdkwr\") pod \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\" (UID: \"c889a0f9-46ee-413a-bae1-94ee4eb8f16d\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.326263 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7515449-4f20-4673-ac23-a7a5a40f852d-operator-scripts\") pod \"a7515449-4f20-4673-ac23-a7a5a40f852d\" (UID: \"a7515449-4f20-4673-ac23-a7a5a40f852d\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.326914 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b539cc27-f876-40e3-b77a-8af750ce5b3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.326953 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.327206 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7515449-4f20-4673-ac23-a7a5a40f852d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7515449-4f20-4673-ac23-a7a5a40f852d" (UID: "a7515449-4f20-4673-ac23-a7a5a40f852d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.327638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c889a0f9-46ee-413a-bae1-94ee4eb8f16d" (UID: "c889a0f9-46ee-413a-bae1-94ee4eb8f16d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.388167 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-kube-api-access-8svw9" (OuterVolumeSpecName: "kube-api-access-8svw9") pod "bfa1c735-01c9-4f3e-ae1d-32bc2af0972d" (UID: "bfa1c735-01c9-4f3e-ae1d-32bc2af0972d"). InnerVolumeSpecName "kube-api-access-8svw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.388235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b539cc27-f876-40e3-b77a-8af750ce5b3a-kube-api-access-zbhcf" (OuterVolumeSpecName: "kube-api-access-zbhcf") pod "b539cc27-f876-40e3-b77a-8af750ce5b3a" (UID: "b539cc27-f876-40e3-b77a-8af750ce5b3a"). InnerVolumeSpecName "kube-api-access-zbhcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.388807 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7515449-4f20-4673-ac23-a7a5a40f852d-kube-api-access-jwm74" (OuterVolumeSpecName: "kube-api-access-jwm74") pod "a7515449-4f20-4673-ac23-a7a5a40f852d" (UID: "a7515449-4f20-4673-ac23-a7a5a40f852d"). InnerVolumeSpecName "kube-api-access-jwm74". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.388897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-kube-api-access-qdkwr" (OuterVolumeSpecName: "kube-api-access-qdkwr") pod "c889a0f9-46ee-413a-bae1-94ee4eb8f16d" (UID: "c889a0f9-46ee-413a-bae1-94ee4eb8f16d"). InnerVolumeSpecName "kube-api-access-qdkwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.430023 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwm74\" (UniqueName: \"kubernetes.io/projected/a7515449-4f20-4673-ac23-a7a5a40f852d-kube-api-access-jwm74\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.430065 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdkwr\" (UniqueName: \"kubernetes.io/projected/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-kube-api-access-qdkwr\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.430079 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbhcf\" (UniqueName: \"kubernetes.io/projected/b539cc27-f876-40e3-b77a-8af750ce5b3a-kube-api-access-zbhcf\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.430091 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7515449-4f20-4673-ac23-a7a5a40f852d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.430105 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c889a0f9-46ee-413a-bae1-94ee4eb8f16d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.430117 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8svw9\" (UniqueName: \"kubernetes.io/projected/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d-kube-api-access-8svw9\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.516395 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.522336 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.532446 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.633372 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e77231-0423-458f-8262-aa12c2536566-operator-scripts\") pod \"29e77231-0423-458f-8262-aa12c2536566\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.633417 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-operator-scripts\") pod \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.633513 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-operator-scripts\") pod \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.633577 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5zpb\" (UniqueName: \"kubernetes.io/projected/29e77231-0423-458f-8262-aa12c2536566-kube-api-access-l5zpb\") pod \"29e77231-0423-458f-8262-aa12c2536566\" (UID: \"29e77231-0423-458f-8262-aa12c2536566\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.633601 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs2gz\" (UniqueName: \"kubernetes.io/projected/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-kube-api-access-vs2gz\") pod \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\" (UID: \"5cd1b6eb-52cf-44aa-993a-90d3abec28ad\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.633641 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6fht\" (UniqueName: \"kubernetes.io/projected/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-kube-api-access-j6fht\") pod \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\" (UID: \"8d0bb151-9b17-4471-9b0d-05f74fa33f0c\") " Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.634049 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5cd1b6eb-52cf-44aa-993a-90d3abec28ad" (UID: "5cd1b6eb-52cf-44aa-993a-90d3abec28ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.634455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d0bb151-9b17-4471-9b0d-05f74fa33f0c" (UID: "8d0bb151-9b17-4471-9b0d-05f74fa33f0c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.634484 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.635455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e77231-0423-458f-8262-aa12c2536566-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29e77231-0423-458f-8262-aa12c2536566" (UID: "29e77231-0423-458f-8262-aa12c2536566"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.638093 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e77231-0423-458f-8262-aa12c2536566-kube-api-access-l5zpb" (OuterVolumeSpecName: "kube-api-access-l5zpb") pod "29e77231-0423-458f-8262-aa12c2536566" (UID: "29e77231-0423-458f-8262-aa12c2536566"). InnerVolumeSpecName "kube-api-access-l5zpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.638138 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-kube-api-access-j6fht" (OuterVolumeSpecName: "kube-api-access-j6fht") pod "8d0bb151-9b17-4471-9b0d-05f74fa33f0c" (UID: "8d0bb151-9b17-4471-9b0d-05f74fa33f0c"). InnerVolumeSpecName "kube-api-access-j6fht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.639290 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-kube-api-access-vs2gz" (OuterVolumeSpecName: "kube-api-access-vs2gz") pod "5cd1b6eb-52cf-44aa-993a-90d3abec28ad" (UID: "5cd1b6eb-52cf-44aa-993a-90d3abec28ad"). InnerVolumeSpecName "kube-api-access-vs2gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.736178 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6fht\" (UniqueName: \"kubernetes.io/projected/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-kube-api-access-j6fht\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.736212 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e77231-0423-458f-8262-aa12c2536566-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.736222 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0bb151-9b17-4471-9b0d-05f74fa33f0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.736231 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5zpb\" (UniqueName: \"kubernetes.io/projected/29e77231-0423-458f-8262-aa12c2536566-kube-api-access-l5zpb\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.736240 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs2gz\" (UniqueName: \"kubernetes.io/projected/5cd1b6eb-52cf-44aa-993a-90d3abec28ad-kube-api-access-vs2gz\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.780303 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-f7vp4" event={"ID":"c889a0f9-46ee-413a-bae1-94ee4eb8f16d","Type":"ContainerDied","Data":"7a3a1ff9d5f8b0b974386285037c5fc78a4d13bfbd7673fea33d4bc870d7a5f8"} Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.780344 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a3a1ff9d5f8b0b974386285037c5fc78a4d13bfbd7673fea33d4bc870d7a5f8" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.780405 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-f7vp4" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.786997 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-bbd8-account-create-update-tnv4m" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.787044 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-bbd8-account-create-update-tnv4m" event={"ID":"b539cc27-f876-40e3-b77a-8af750ce5b3a","Type":"ContainerDied","Data":"8505ace8458b3bc3f869bc05421920ea9754616da1600d094d8f2747c6432d77"} Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.787069 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8505ace8458b3bc3f869bc05421920ea9754616da1600d094d8f2747c6432d77" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.790016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-58e3-account-create-update-qf5td" event={"ID":"8d0bb151-9b17-4471-9b0d-05f74fa33f0c","Type":"ContainerDied","Data":"439f6ee826c6a9cfb86d2dcf01fcd627b3f5f7a81364853b124c3648e9a5f44d"} Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.790045 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="439f6ee826c6a9cfb86d2dcf01fcd627b3f5f7a81364853b124c3648e9a5f44d" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.790105 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-58e3-account-create-update-qf5td" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.796452 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8031-account-create-update-x2qgm" event={"ID":"a7515449-4f20-4673-ac23-a7a5a40f852d","Type":"ContainerDied","Data":"7dd0db7069aeb2bda06a9fe133ca9a08dede1f37c4f70dbe7c2854243983c84b"} Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.796481 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dd0db7069aeb2bda06a9fe133ca9a08dede1f37c4f70dbe7c2854243983c84b" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.796535 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8031-account-create-update-x2qgm" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.806779 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bg7z9" event={"ID":"29e77231-0423-458f-8262-aa12c2536566","Type":"ContainerDied","Data":"21f7c920dc99e90bdd6b34d448916aac90bb2a1c05c378efaa5e91c90c9abac3"} Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.806797 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bg7z9" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.806802 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f7c920dc99e90bdd6b34d448916aac90bb2a1c05c378efaa5e91c90c9abac3" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.825844 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lpdtt" event={"ID":"bfa1c735-01c9-4f3e-ae1d-32bc2af0972d","Type":"ContainerDied","Data":"406ae3e11d35520dd2d6c91ed74bae17f7928537aed81575e79575cb521b001b"} Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.825886 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="406ae3e11d35520dd2d6c91ed74bae17f7928537aed81575e79575cb521b001b" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.825943 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lpdtt" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.829666 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dr5bl" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.829692 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dbc8-account-create-update-x8lg9" Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.829734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dbc8-account-create-update-x8lg9" event={"ID":"5cd1b6eb-52cf-44aa-993a-90d3abec28ad","Type":"ContainerDied","Data":"c04ade9a52131b0b76590538e22bd9a048b4970d4df8ba951cde653a21dd87d0"} Dec 01 21:53:51 crc kubenswrapper[4962]: I1201 21:53:51.829757 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c04ade9a52131b0b76590538e22bd9a048b4970d4df8ba951cde653a21dd87d0" Dec 01 21:53:51 crc kubenswrapper[4962]: E1201 21:53:51.943883 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfa1c735_01c9_4f3e_ae1d_32bc2af0972d.slice/crio-406ae3e11d35520dd2d6c91ed74bae17f7928537aed81575e79575cb521b001b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb539cc27_f876_40e3_b77a_8af750ce5b3a.slice/crio-8505ace8458b3bc3f869bc05421920ea9754616da1600d094d8f2747c6432d77\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f45ac0_c39d_4785_957e_e69e1659927e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0bb151_9b17_4471_9b0d_05f74fa33f0c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc889a0f9_46ee_413a_bae1_94ee4eb8f16d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7515449_4f20_4673_ac23_a7a5a40f852d.slice/crio-7dd0db7069aeb2bda06a9fe133ca9a08dede1f37c4f70dbe7c2854243983c84b\": RecentStats: unable to find data in memory cache]" Dec 01 21:53:52 crc kubenswrapper[4962]: I1201 21:53:52.605128 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xd7ph" Dec 01 21:53:52 crc kubenswrapper[4962]: I1201 21:53:52.656130 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:52 crc kubenswrapper[4962]: I1201 21:53:52.675888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3-etc-swift\") pod \"swift-storage-0\" (UID: \"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3\") " pod="openstack/swift-storage-0" Dec 01 21:53:52 crc kubenswrapper[4962]: I1201 21:53:52.842975 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620","Type":"ContainerStarted","Data":"8799a055bd1da97ec035082f3f2d1951555f16b8e3856b1314e634e710269b4b"} Dec 01 21:53:52 crc kubenswrapper[4962]: I1201 21:53:52.937390 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 21:53:53 crc kubenswrapper[4962]: I1201 21:53:53.480173 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 21:53:57 crc kubenswrapper[4962]: I1201 21:53:57.929901 4962 generic.go:334] "Generic (PLEG): container finished" podID="7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620" containerID="8799a055bd1da97ec035082f3f2d1951555f16b8e3856b1314e634e710269b4b" exitCode=0 Dec 01 21:53:57 crc kubenswrapper[4962]: I1201 21:53:57.929971 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620","Type":"ContainerDied","Data":"8799a055bd1da97ec035082f3f2d1951555f16b8e3856b1314e634e710269b4b"} Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.821182 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-log-ovn\") pod \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914580 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run\") pod \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914607 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5987bc9d-49c8-4f77-a8b4-509d2096ca53" (UID: "5987bc9d-49c8-4f77-a8b4-509d2096ca53"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914649 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-scripts\") pod \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914721 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run-ovn\") pod \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914766 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-additional-scripts\") pod \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914736 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run" (OuterVolumeSpecName: "var-run") pod "5987bc9d-49c8-4f77-a8b4-509d2096ca53" (UID: "5987bc9d-49c8-4f77-a8b4-509d2096ca53"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.914797 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6nl4\" (UniqueName: \"kubernetes.io/projected/5987bc9d-49c8-4f77-a8b4-509d2096ca53-kube-api-access-m6nl4\") pod \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\" (UID: \"5987bc9d-49c8-4f77-a8b4-509d2096ca53\") " Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.915634 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5987bc9d-49c8-4f77-a8b4-509d2096ca53" (UID: "5987bc9d-49c8-4f77-a8b4-509d2096ca53"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.916074 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.916100 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.916123 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5987bc9d-49c8-4f77-a8b4-509d2096ca53-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.916173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5987bc9d-49c8-4f77-a8b4-509d2096ca53" (UID: "5987bc9d-49c8-4f77-a8b4-509d2096ca53"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.916692 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-scripts" (OuterVolumeSpecName: "scripts") pod "5987bc9d-49c8-4f77-a8b4-509d2096ca53" (UID: "5987bc9d-49c8-4f77-a8b4-509d2096ca53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.930670 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5987bc9d-49c8-4f77-a8b4-509d2096ca53-kube-api-access-m6nl4" (OuterVolumeSpecName: "kube-api-access-m6nl4") pod "5987bc9d-49c8-4f77-a8b4-509d2096ca53" (UID: "5987bc9d-49c8-4f77-a8b4-509d2096ca53"). InnerVolumeSpecName "kube-api-access-m6nl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.954737 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-8szbc" event={"ID":"5987bc9d-49c8-4f77-a8b4-509d2096ca53","Type":"ContainerDied","Data":"2474b0ef9a85f4b2454e15688528043935c4128499d2fbf12c05f821028ec796"} Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.954816 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2474b0ef9a85f4b2454e15688528043935c4128499d2fbf12c05f821028ec796" Dec 01 21:53:59 crc kubenswrapper[4962]: I1201 21:53:59.954838 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-8szbc" Dec 01 21:54:00 crc kubenswrapper[4962]: I1201 21:54:00.018533 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:00 crc kubenswrapper[4962]: I1201 21:54:00.018571 4962 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5987bc9d-49c8-4f77-a8b4-509d2096ca53-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:00 crc kubenswrapper[4962]: I1201 21:54:00.018587 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6nl4\" (UniqueName: \"kubernetes.io/projected/5987bc9d-49c8-4f77-a8b4-509d2096ca53-kube-api-access-m6nl4\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:00 crc kubenswrapper[4962]: I1201 21:54:00.989514 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xd7ph-config-8szbc"] Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.016224 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xd7ph-config-8szbc"] Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.145435 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xd7ph-config-9dk5j"] Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.145865 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b539cc27-f876-40e3-b77a-8af750ce5b3a" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.145883 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b539cc27-f876-40e3-b77a-8af750ce5b3a" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.145921 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0bb151-9b17-4471-9b0d-05f74fa33f0c" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.145942 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0bb151-9b17-4471-9b0d-05f74fa33f0c" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.145951 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd1b6eb-52cf-44aa-993a-90d3abec28ad" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.145958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd1b6eb-52cf-44aa-993a-90d3abec28ad" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.145967 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f45ac0-c39d-4785-957e-e69e1659927e" 
containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.145973 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f45ac0-c39d-4785-957e-e69e1659927e" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.145985 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c889a0f9-46ee-413a-bae1-94ee4eb8f16d" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.145993 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c889a0f9-46ee-413a-bae1-94ee4eb8f16d" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.146006 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e77231-0423-458f-8262-aa12c2536566" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146012 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e77231-0423-458f-8262-aa12c2536566" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.146027 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5987bc9d-49c8-4f77-a8b4-509d2096ca53" containerName="ovn-config" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146034 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5987bc9d-49c8-4f77-a8b4-509d2096ca53" containerName="ovn-config" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.146045 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa1c735-01c9-4f3e-ae1d-32bc2af0972d" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146051 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa1c735-01c9-4f3e-ae1d-32bc2af0972d" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: E1201 21:54:01.146072 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7515449-4f20-4673-ac23-a7a5a40f852d" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146079 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7515449-4f20-4673-ac23-a7a5a40f852d" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146275 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f45ac0-c39d-4785-957e-e69e1659927e" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146290 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa1c735-01c9-4f3e-ae1d-32bc2af0972d" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146301 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c889a0f9-46ee-413a-bae1-94ee4eb8f16d" containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146311 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7515449-4f20-4673-ac23-a7a5a40f852d" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146322 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b539cc27-f876-40e3-b77a-8af750ce5b3a" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146331 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e77231-0423-458f-8262-aa12c2536566" 
containerName="mariadb-database-create" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146339 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5987bc9d-49c8-4f77-a8b4-509d2096ca53" containerName="ovn-config" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146350 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0bb151-9b17-4471-9b0d-05f74fa33f0c" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.146358 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd1b6eb-52cf-44aa-993a-90d3abec28ad" containerName="mariadb-account-create-update" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.147083 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.176046 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xd7ph-config-9dk5j"] Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.188432 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.269748 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc5f5\" (UniqueName: \"kubernetes.io/projected/944353cb-2186-4dd5-ba7c-4841ef140b2c-kube-api-access-sc5f5\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.269837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.269862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-scripts\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.269889 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-log-ovn\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.270005 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-additional-scripts\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.270037 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run-ovn\") 
pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.373040 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run-ovn\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.373147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc5f5\" (UniqueName: \"kubernetes.io/projected/944353cb-2186-4dd5-ba7c-4841ef140b2c-kube-api-access-sc5f5\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.373206 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.373227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-scripts\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.373258 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-log-ovn\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.373346 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-additional-scripts\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.374239 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-additional-scripts\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.374469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run-ovn\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.374507 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run\") pod 
\"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.374886 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-log-ovn\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.376415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-scripts\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.404445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc5f5\" (UniqueName: \"kubernetes.io/projected/944353cb-2186-4dd5-ba7c-4841ef140b2c-kube-api-access-sc5f5\") pod \"ovn-controller-xd7ph-config-9dk5j\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:01 crc kubenswrapper[4962]: I1201 21:54:01.520713 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:02 crc kubenswrapper[4962]: I1201 21:54:02.231418 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5987bc9d-49c8-4f77-a8b4-509d2096ca53" path="/var/lib/kubelet/pods/5987bc9d-49c8-4f77-a8b4-509d2096ca53/volumes" Dec 01 21:54:02 crc kubenswrapper[4962]: I1201 21:54:02.786989 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:54:02 crc kubenswrapper[4962]: I1201 21:54:02.787062 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:54:02 crc kubenswrapper[4962]: E1201 21:54:02.970530 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Dec 01 21:54:02 crc kubenswrapper[4962]: E1201 21:54:02.970674 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c6cks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-qd22d_openstack(e6634bf9-94a9-4b1c-b14b-44b4ecc882bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:54:02 crc kubenswrapper[4962]: E1201 21:54:02.972274 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/keystone-db-sync-qd22d" podUID="e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" Dec 01 21:54:03 crc kubenswrapper[4962]: E1201 21:54:03.009177 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="openstack/keystone-db-sync-qd22d" podUID="e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" Dec 01 21:54:03 crc kubenswrapper[4962]: I1201 21:54:03.487557 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xd7ph-config-9dk5j"] Dec 01 21:54:03 crc kubenswrapper[4962]: W1201 21:54:03.495443 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod944353cb_2186_4dd5_ba7c_4841ef140b2c.slice/crio-bdfaf92c6750a5593a380f941cca66bf912cced1597d76ba28dc1a5e866df88b WatchSource:0}: Error finding container bdfaf92c6750a5593a380f941cca66bf912cced1597d76ba28dc1a5e866df88b: Status 404 returned error can't find the container with id bdfaf92c6750a5593a380f941cca66bf912cced1597d76ba28dc1a5e866df88b Dec 01 21:54:03 crc kubenswrapper[4962]: I1201 21:54:03.678034 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 21:54:03 crc kubenswrapper[4962]: W1201 21:54:03.681542 4962 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53f9dd4_f949_4ae6_a2d5_7a19a21d80c3.slice/crio-a84689a4716f8202521d65d0dc389e7bb110fdad016c610fc4b0a15c60d9548a WatchSource:0}: Error finding container a84689a4716f8202521d65d0dc389e7bb110fdad016c610fc4b0a15c60d9548a: Status 404 returned error can't find the container with id a84689a4716f8202521d65d0dc389e7bb110fdad016c610fc4b0a15c60d9548a Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.018755 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620","Type":"ContainerStarted","Data":"48e5472debba328f566b88c5a3e8a260622ba411bc7c38e98fcc5df61e62fce5"} Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.020330 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-9dk5j" event={"ID":"944353cb-2186-4dd5-ba7c-4841ef140b2c","Type":"ContainerStarted","Data":"3a870dd0de19c7ac629701cb1c7e5d27c5d5f4d2d56d910a99a91d08aa4523cd"} Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.020390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-9dk5j" event={"ID":"944353cb-2186-4dd5-ba7c-4841ef140b2c","Type":"ContainerStarted","Data":"bdfaf92c6750a5593a380f941cca66bf912cced1597d76ba28dc1a5e866df88b"} Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.021465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-28z4p" event={"ID":"235fb826-ef71-488f-b902-efcf5dc9a7dd","Type":"ContainerStarted","Data":"b9bc37bd96b4ccb33a288be857b879b5744c6cdb5081ebf82d1f77c52393fb2d"} Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.022869 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a120e58c-62c2-4242-a668-151b872a9cb4","Type":"ContainerStarted","Data":"565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534"} Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.026290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"a84689a4716f8202521d65d0dc389e7bb110fdad016c610fc4b0a15c60d9548a"} Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.045093 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xd7ph-config-9dk5j" podStartSLOduration=3.045070389 podStartE2EDuration="3.045070389s" podCreationTimestamp="2025-12-01 21:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:04.040421376 +0000 UTC m=+1228.141860582" watchObservedRunningTime="2025-12-01 21:54:04.045070389 +0000 UTC m=+1228.146509584" Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.070381 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.627446566 podStartE2EDuration="19.070358699s" podCreationTimestamp="2025-12-01 21:53:45 +0000 UTC" firstStartedPulling="2025-12-01 21:53:47.551384098 +0000 UTC m=+1211.652823293" lastFinishedPulling="2025-12-01 21:54:02.994296221 +0000 UTC m=+1227.095735426" observedRunningTime="2025-12-01 21:54:04.055049473 +0000 UTC m=+1228.156488678" watchObservedRunningTime="2025-12-01 21:54:04.070358699 +0000 UTC m=+1228.171797894" Dec 01 21:54:04 crc kubenswrapper[4962]: I1201 21:54:04.094750 4962 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-28z4p" podStartSLOduration=3.519653564 podStartE2EDuration="26.094731114s" podCreationTimestamp="2025-12-01 21:53:38 +0000 UTC" firstStartedPulling="2025-12-01 21:53:40.474914328 +0000 UTC m=+1204.576353523" lastFinishedPulling="2025-12-01 21:54:03.049991878 +0000 UTC m=+1227.151431073" observedRunningTime="2025-12-01 21:54:04.092785168 +0000 UTC m=+1228.194224383" watchObservedRunningTime="2025-12-01 21:54:04.094731114 +0000 UTC m=+1228.196170319" Dec 01 21:54:05 crc kubenswrapper[4962]: I1201 21:54:05.036515 4962 generic.go:334] "Generic (PLEG): container finished" podID="944353cb-2186-4dd5-ba7c-4841ef140b2c" containerID="3a870dd0de19c7ac629701cb1c7e5d27c5d5f4d2d56d910a99a91d08aa4523cd" exitCode=0 Dec 01 21:54:05 crc kubenswrapper[4962]: I1201 21:54:05.036606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-9dk5j" event={"ID":"944353cb-2186-4dd5-ba7c-4841ef140b2c","Type":"ContainerDied","Data":"3a870dd0de19c7ac629701cb1c7e5d27c5d5f4d2d56d910a99a91d08aa4523cd"} Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.634608 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.769439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-scripts\") pod \"944353cb-2186-4dd5-ba7c-4841ef140b2c\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.769824 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-log-ovn\") pod \"944353cb-2186-4dd5-ba7c-4841ef140b2c\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.769909 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-additional-scripts\") pod \"944353cb-2186-4dd5-ba7c-4841ef140b2c\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.769964 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run\") pod \"944353cb-2186-4dd5-ba7c-4841ef140b2c\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.770199 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run-ovn\") pod \"944353cb-2186-4dd5-ba7c-4841ef140b2c\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.770332 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc5f5\" (UniqueName: \"kubernetes.io/projected/944353cb-2186-4dd5-ba7c-4841ef140b2c-kube-api-access-sc5f5\") pod \"944353cb-2186-4dd5-ba7c-4841ef140b2c\" (UID: \"944353cb-2186-4dd5-ba7c-4841ef140b2c\") " Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.770747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-scripts" (OuterVolumeSpecName: "scripts") pod "944353cb-2186-4dd5-ba7c-4841ef140b2c" (UID: "944353cb-2186-4dd5-ba7c-4841ef140b2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771076 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "944353cb-2186-4dd5-ba7c-4841ef140b2c" (UID: "944353cb-2186-4dd5-ba7c-4841ef140b2c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771157 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run" (OuterVolumeSpecName: "var-run") pod "944353cb-2186-4dd5-ba7c-4841ef140b2c" (UID: "944353cb-2186-4dd5-ba7c-4841ef140b2c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771203 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "944353cb-2186-4dd5-ba7c-4841ef140b2c" (UID: "944353cb-2186-4dd5-ba7c-4841ef140b2c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771517 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771543 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771557 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771568 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/944353cb-2186-4dd5-ba7c-4841ef140b2c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.771537 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "944353cb-2186-4dd5-ba7c-4841ef140b2c" (UID: "944353cb-2186-4dd5-ba7c-4841ef140b2c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.779229 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944353cb-2186-4dd5-ba7c-4841ef140b2c-kube-api-access-sc5f5" (OuterVolumeSpecName: "kube-api-access-sc5f5") pod "944353cb-2186-4dd5-ba7c-4841ef140b2c" (UID: "944353cb-2186-4dd5-ba7c-4841ef140b2c"). InnerVolumeSpecName "kube-api-access-sc5f5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.873804 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc5f5\" (UniqueName: \"kubernetes.io/projected/944353cb-2186-4dd5-ba7c-4841ef140b2c-kube-api-access-sc5f5\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:06 crc kubenswrapper[4962]: I1201 21:54:06.873841 4962 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/944353cb-2186-4dd5-ba7c-4841ef140b2c-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.059762 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"d5e6581b022919dbcadb4669280254cbc6c885bf0e21d93368d8b356aef3a08b"} Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.060207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"35b7145fe827ddef18627168dc8fe5bd4e9babbf0b76107a358b017ddfbcba49"} Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.064218 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620","Type":"ContainerStarted","Data":"6af69d3d8fe932c92654a0d562d8a63391dc2c04b7a78577e9e1b7eb2b8b9b72"} Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.064262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620","Type":"ContainerStarted","Data":"f6a0da4079d83e41ff47d33e65be6ab8a0ac225799f409b18d8a9caf2946b1da"} Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.066825 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xd7ph-config-9dk5j" event={"ID":"944353cb-2186-4dd5-ba7c-4841ef140b2c","Type":"ContainerDied","Data":"bdfaf92c6750a5593a380f941cca66bf912cced1597d76ba28dc1a5e866df88b"} Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.066861 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdfaf92c6750a5593a380f941cca66bf912cced1597d76ba28dc1a5e866df88b" Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.066911 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xd7ph-config-9dk5j" Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.101644 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.101624526 podStartE2EDuration="21.101624526s" podCreationTimestamp="2025-12-01 21:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:07.099482125 +0000 UTC m=+1231.200921340" watchObservedRunningTime="2025-12-01 21:54:07.101624526 +0000 UTC m=+1231.203063731" Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.705531 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xd7ph-config-9dk5j"] Dec 01 21:54:07 crc kubenswrapper[4962]: I1201 21:54:07.720227 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xd7ph-config-9dk5j"] Dec 01 21:54:08 crc kubenswrapper[4962]: I1201 21:54:08.094187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"0908449a847332624ee02328dda2b45a75f0180c84ae6b2d1590f011efd4720d"} Dec 01 21:54:08 crc kubenswrapper[4962]: I1201 21:54:08.094248 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"54c75c54906bc3b6fcd30855790830f7a7aab376a48db6db12b5d1fbdb1cc30c"} Dec 01 21:54:08 crc kubenswrapper[4962]: I1201 21:54:08.233078 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944353cb-2186-4dd5-ba7c-4841ef140b2c" path="/var/lib/kubelet/pods/944353cb-2186-4dd5-ba7c-4841ef140b2c/volumes" Dec 01 21:54:10 crc kubenswrapper[4962]: I1201 21:54:10.118790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"0b0297cef425686bf7063f62e804820438247d66eaa01c8b070ed6c3fe18761d"} Dec 01 21:54:10 crc kubenswrapper[4962]: I1201 21:54:10.119293 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"2a96d34272abc2add8c360976f6824c957a810d45c48a476ba41330b145cdb28"} Dec 01 21:54:10 crc kubenswrapper[4962]: I1201 21:54:10.119306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"b9d40244bd632768e185f61efc686b89860b801a3b6f35a5de4d7fe482743d72"} Dec 01 21:54:10 crc kubenswrapper[4962]: I1201 21:54:10.119318 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"9e87a10d692a9df20ba6f313181ed49e010577cba4794ff73aa3fade40bcdd73"} Dec 01 21:54:11 crc kubenswrapper[4962]: I1201 21:54:11.142740 4962 generic.go:334] "Generic (PLEG): container finished" podID="235fb826-ef71-488f-b902-efcf5dc9a7dd" containerID="b9bc37bd96b4ccb33a288be857b879b5744c6cdb5081ebf82d1f77c52393fb2d" exitCode=0 Dec 01 21:54:11 crc kubenswrapper[4962]: I1201 21:54:11.142879 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-28z4p" 
event={"ID":"235fb826-ef71-488f-b902-efcf5dc9a7dd","Type":"ContainerDied","Data":"b9bc37bd96b4ccb33a288be857b879b5744c6cdb5081ebf82d1f77c52393fb2d"} Dec 01 21:54:11 crc kubenswrapper[4962]: I1201 21:54:11.929219 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.164390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"3eac09a7e08e02ffd5b5c7929e2b06c4d42b8f113b2226a710d3315cc84bf01d"} Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.164440 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"f13744a85c1aa88394850ca5608532fad64e3bdcfd5e96f8fa46a858fa41b290"} Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.164453 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"74cd3a7671695f18799614b3af1c1eed4b196f6068f4400b01a72e3d9c8156d6"} Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.766015 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-28z4p" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.825875 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-combined-ca-bundle\") pod \"235fb826-ef71-488f-b902-efcf5dc9a7dd\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.826001 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-config-data\") pod \"235fb826-ef71-488f-b902-efcf5dc9a7dd\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.826027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-db-sync-config-data\") pod \"235fb826-ef71-488f-b902-efcf5dc9a7dd\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.826160 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdpmp\" (UniqueName: \"kubernetes.io/projected/235fb826-ef71-488f-b902-efcf5dc9a7dd-kube-api-access-sdpmp\") pod \"235fb826-ef71-488f-b902-efcf5dc9a7dd\" (UID: \"235fb826-ef71-488f-b902-efcf5dc9a7dd\") " Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.832080 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "235fb826-ef71-488f-b902-efcf5dc9a7dd" (UID: "235fb826-ef71-488f-b902-efcf5dc9a7dd"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.843169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235fb826-ef71-488f-b902-efcf5dc9a7dd-kube-api-access-sdpmp" (OuterVolumeSpecName: "kube-api-access-sdpmp") pod "235fb826-ef71-488f-b902-efcf5dc9a7dd" (UID: "235fb826-ef71-488f-b902-efcf5dc9a7dd"). InnerVolumeSpecName "kube-api-access-sdpmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.868338 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "235fb826-ef71-488f-b902-efcf5dc9a7dd" (UID: "235fb826-ef71-488f-b902-efcf5dc9a7dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.889731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-config-data" (OuterVolumeSpecName: "config-data") pod "235fb826-ef71-488f-b902-efcf5dc9a7dd" (UID: "235fb826-ef71-488f-b902-efcf5dc9a7dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.928184 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdpmp\" (UniqueName: \"kubernetes.io/projected/235fb826-ef71-488f-b902-efcf5dc9a7dd-kube-api-access-sdpmp\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.928222 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.928234 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:12 crc kubenswrapper[4962]: I1201 21:54:12.928242 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/235fb826-ef71-488f-b902-efcf5dc9a7dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.193971 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"aab71ed43cef3c253878cd87cc4ed5dc2916ade8db5ffbd4049e4598b795a9ab"} Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.194058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"84e78122facdd068182bbadbd256dd0b2fd15c3da201e539e1182188dd60ba6e"} Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.194085 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"2e02406aaa130ebd99984e3f09ac15d5f7f69e6cf84b2af7c1ac98d44717547b"} Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.196997 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-28z4p" 
event={"ID":"235fb826-ef71-488f-b902-efcf5dc9a7dd","Type":"ContainerDied","Data":"76176445f256833949ecc0764c276eb96a69cedfb523fafc04b5c6e3274a3e87"} Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.197035 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76176445f256833949ecc0764c276eb96a69cedfb523fafc04b5c6e3274a3e87" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.197085 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-28z4p" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.654895 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jt748"] Dec 01 21:54:13 crc kubenswrapper[4962]: E1201 21:54:13.655599 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944353cb-2186-4dd5-ba7c-4841ef140b2c" containerName="ovn-config" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.655617 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="944353cb-2186-4dd5-ba7c-4841ef140b2c" containerName="ovn-config" Dec 01 21:54:13 crc kubenswrapper[4962]: E1201 21:54:13.655648 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235fb826-ef71-488f-b902-efcf5dc9a7dd" containerName="glance-db-sync" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.655656 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="235fb826-ef71-488f-b902-efcf5dc9a7dd" containerName="glance-db-sync" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.655856 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="944353cb-2186-4dd5-ba7c-4841ef140b2c" containerName="ovn-config" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.655873 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="235fb826-ef71-488f-b902-efcf5dc9a7dd" containerName="glance-db-sync" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.656969 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.675477 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jt748"] Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.759113 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-dns-svc\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.759169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.759209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.759241 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-config\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.759376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw82c\" (UniqueName: \"kubernetes.io/projected/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-kube-api-access-fw82c\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.860790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-dns-svc\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.860845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.860900 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.861837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.861874 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.861911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-config\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.861928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-dns-svc\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.861980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw82c\" (UniqueName: \"kubernetes.io/projected/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-kube-api-access-fw82c\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.862475 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-config\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.878646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw82c\" (UniqueName: \"kubernetes.io/projected/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-kube-api-access-fw82c\") pod \"dnsmasq-dns-74dc88fc-jt748\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:13 crc kubenswrapper[4962]: I1201 21:54:13.996696 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.216552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3","Type":"ContainerStarted","Data":"e1e14b2d7315a5a088c9ee6ba8890edbac36e37c0d27c3f1c5fb099ba97de03b"} Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.253823 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=47.847508257 podStartE2EDuration="55.253808083s" podCreationTimestamp="2025-12-01 21:53:19 +0000 UTC" firstStartedPulling="2025-12-01 21:54:03.684468026 +0000 UTC m=+1227.785907221" lastFinishedPulling="2025-12-01 21:54:11.090767852 +0000 UTC m=+1235.192207047" observedRunningTime="2025-12-01 21:54:14.249817219 +0000 UTC m=+1238.351256424" watchObservedRunningTime="2025-12-01 21:54:14.253808083 +0000 UTC m=+1238.355247278" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.511546 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jt748"] Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.548008 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jt748"] Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.571370 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vhbb6"] Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.575040 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.580263 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.583767 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vhbb6"] Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.679735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.680122 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.680290 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crk5v\" (UniqueName: \"kubernetes.io/projected/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-kube-api-access-crk5v\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.680396 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: 
\"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.680478 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-config\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.680602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.782505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crk5v\" (UniqueName: \"kubernetes.io/projected/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-kube-api-access-crk5v\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.782581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.782606 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-config\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.782652 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.782723 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.782781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.783718 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.783864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-config\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.783884 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.783974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.784286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.805050 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crk5v\" (UniqueName: \"kubernetes.io/projected/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-kube-api-access-crk5v\") pod \"dnsmasq-dns-5f59b8f679-vhbb6\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:14 crc kubenswrapper[4962]: I1201 21:54:14.982240 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:15 crc kubenswrapper[4962]: I1201 21:54:15.251684 4962 generic.go:334] "Generic (PLEG): container finished" podID="c6ac2b00-3dcf-4181-a7f7-9954685ca96c" containerID="c92b5e08f88c061c5d658381922cc95600b2e445d39e8a2305ce8c46062e6286" exitCode=0 Dec 01 21:54:15 crc kubenswrapper[4962]: I1201 21:54:15.253465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jt748" event={"ID":"c6ac2b00-3dcf-4181-a7f7-9954685ca96c","Type":"ContainerDied","Data":"c92b5e08f88c061c5d658381922cc95600b2e445d39e8a2305ce8c46062e6286"} Dec 01 21:54:15 crc kubenswrapper[4962]: I1201 21:54:15.253492 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jt748" event={"ID":"c6ac2b00-3dcf-4181-a7f7-9954685ca96c","Type":"ContainerStarted","Data":"6f2c93c91e93925c57ad826af97fb72a5401ef80abe0eaa0b8d21b23a4afcf6d"} Dec 01 21:54:15 crc kubenswrapper[4962]: I1201 21:54:15.628968 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vhbb6"] Dec 01 21:54:15 crc kubenswrapper[4962]: I1201 21:54:15.897641 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.037666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-nb\") pod \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.038281 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-dns-svc\") pod \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.038333 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw82c\" (UniqueName: \"kubernetes.io/projected/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-kube-api-access-fw82c\") pod \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.038399 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-config\") pod \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.038480 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-sb\") pod \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\" (UID: \"c6ac2b00-3dcf-4181-a7f7-9954685ca96c\") " Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.042906 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-kube-api-access-fw82c" (OuterVolumeSpecName: "kube-api-access-fw82c") pod "c6ac2b00-3dcf-4181-a7f7-9954685ca96c" (UID: "c6ac2b00-3dcf-4181-a7f7-9954685ca96c"). InnerVolumeSpecName "kube-api-access-fw82c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.063271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-config" (OuterVolumeSpecName: "config") pod "c6ac2b00-3dcf-4181-a7f7-9954685ca96c" (UID: "c6ac2b00-3dcf-4181-a7f7-9954685ca96c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.075797 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c6ac2b00-3dcf-4181-a7f7-9954685ca96c" (UID: "c6ac2b00-3dcf-4181-a7f7-9954685ca96c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.087228 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c6ac2b00-3dcf-4181-a7f7-9954685ca96c" (UID: "c6ac2b00-3dcf-4181-a7f7-9954685ca96c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.096643 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6ac2b00-3dcf-4181-a7f7-9954685ca96c" (UID: "c6ac2b00-3dcf-4181-a7f7-9954685ca96c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.140685 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.140744 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.140765 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw82c\" (UniqueName: \"kubernetes.io/projected/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-kube-api-access-fw82c\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.140785 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.140802 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ac2b00-3dcf-4181-a7f7-9954685ca96c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.267125 4962 generic.go:334] "Generic (PLEG): container finished" podID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerID="6ad9975cf2a83da885d1e72842624c13d5c5963975b6061d53d3e0ff43593de5" exitCode=0 Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.267206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" event={"ID":"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af","Type":"ContainerDied","Data":"6ad9975cf2a83da885d1e72842624c13d5c5963975b6061d53d3e0ff43593de5"} Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.267251 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" event={"ID":"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af","Type":"ContainerStarted","Data":"d47ae36e1e4e3b6743b00e67da5be55de146099090dd80b4a4c6e5ad1e6cbec0"} Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.271339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jt748" event={"ID":"c6ac2b00-3dcf-4181-a7f7-9954685ca96c","Type":"ContainerDied","Data":"6f2c93c91e93925c57ad826af97fb72a5401ef80abe0eaa0b8d21b23a4afcf6d"} Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.271398 4962 scope.go:117] "RemoveContainer" containerID="c92b5e08f88c061c5d658381922cc95600b2e445d39e8a2305ce8c46062e6286" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.271560 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jt748" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.369693 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jt748"] Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.381532 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jt748"] Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.928210 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 01 21:54:16 crc kubenswrapper[4962]: I1201 21:54:16.936515 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 01 21:54:17 crc kubenswrapper[4962]: I1201 21:54:17.289338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" event={"ID":"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af","Type":"ContainerStarted","Data":"b2006a7882e564db2976f58ea596464dc1508227cb617c0c64ce425635602e33"} Dec 01 21:54:17 crc kubenswrapper[4962]: I1201 21:54:17.289610 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:17 crc kubenswrapper[4962]: I1201 21:54:17.293245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qd22d" event={"ID":"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb","Type":"ContainerStarted","Data":"0cf2149813cb7bb627682a3022799e5c83455c122581776f0639344ccd6f3827"} Dec 01 21:54:17 crc kubenswrapper[4962]: I1201 21:54:17.303738 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 01 21:54:17 crc kubenswrapper[4962]: I1201 21:54:17.318028 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" podStartSLOduration=3.318009977 podStartE2EDuration="3.318009977s" podCreationTimestamp="2025-12-01 21:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:17.309166085 +0000 UTC m=+1241.410605280" watchObservedRunningTime="2025-12-01 21:54:17.318009977 +0000 UTC m=+1241.419449182" Dec 01 21:54:17 crc kubenswrapper[4962]: I1201 21:54:17.337351 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qd22d" podStartSLOduration=3.5749079889999997 podStartE2EDuration="32.337330237s" podCreationTimestamp="2025-12-01 21:53:45 +0000 UTC" firstStartedPulling="2025-12-01 21:53:47.254411876 +0000 UTC m=+1211.355851071" lastFinishedPulling="2025-12-01 21:54:16.016834124 +0000 UTC m=+1240.118273319" observedRunningTime="2025-12-01 21:54:17.328167786 +0000 UTC m=+1241.429607041" watchObservedRunningTime="2025-12-01 21:54:17.337330237 +0000 UTC m=+1241.438769442" Dec 01 21:54:18 crc kubenswrapper[4962]: I1201 21:54:18.234114 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ac2b00-3dcf-4181-a7f7-9954685ca96c" path="/var/lib/kubelet/pods/c6ac2b00-3dcf-4181-a7f7-9954685ca96c/volumes" Dec 01 21:54:21 crc kubenswrapper[4962]: E1201 21:54:21.965832 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6634bf9_94a9_4b1c_b14b_44b4ecc882bb.slice/crio-0cf2149813cb7bb627682a3022799e5c83455c122581776f0639344ccd6f3827.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6634bf9_94a9_4b1c_b14b_44b4ecc882bb.slice/crio-conmon-0cf2149813cb7bb627682a3022799e5c83455c122581776f0639344ccd6f3827.scope\": RecentStats: unable to find data in memory cache]" Dec 01 21:54:22 crc kubenswrapper[4962]: I1201 21:54:22.347373 4962 generic.go:334] "Generic (PLEG): container finished" podID="e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" containerID="0cf2149813cb7bb627682a3022799e5c83455c122581776f0639344ccd6f3827" exitCode=0 Dec 01 21:54:22 crc kubenswrapper[4962]: I1201 21:54:22.347425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qd22d" event={"ID":"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb","Type":"ContainerDied","Data":"0cf2149813cb7bb627682a3022799e5c83455c122581776f0639344ccd6f3827"} Dec 01 21:54:23 crc kubenswrapper[4962]: I1201 21:54:23.812721 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qd22d" Dec 01 21:54:23 crc kubenswrapper[4962]: I1201 21:54:23.927323 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-config-data\") pod \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " Dec 01 21:54:23 crc kubenswrapper[4962]: I1201 21:54:23.927430 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-combined-ca-bundle\") pod \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " Dec 01 21:54:23 crc kubenswrapper[4962]: I1201 21:54:23.927595 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6cks\" (UniqueName: \"kubernetes.io/projected/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-kube-api-access-c6cks\") pod \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\" (UID: \"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb\") " Dec 01 21:54:23 crc kubenswrapper[4962]: I1201 21:54:23.981165 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-kube-api-access-c6cks" (OuterVolumeSpecName: "kube-api-access-c6cks") pod "e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" (UID: "e6634bf9-94a9-4b1c-b14b-44b4ecc882bb"). InnerVolumeSpecName "kube-api-access-c6cks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.030181 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6cks\" (UniqueName: \"kubernetes.io/projected/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-kube-api-access-c6cks\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.073107 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-config-data" (OuterVolumeSpecName: "config-data") pod "e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" (UID: "e6634bf9-94a9-4b1c-b14b-44b4ecc882bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.074821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" (UID: "e6634bf9-94a9-4b1c-b14b-44b4ecc882bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.132112 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.132333 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.375186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qd22d" event={"ID":"e6634bf9-94a9-4b1c-b14b-44b4ecc882bb","Type":"ContainerDied","Data":"214e2173990b5e8285f59310540f15223c48240ca91c4cb2003acaf84787e35e"} Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.375218 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="214e2173990b5e8285f59310540f15223c48240ca91c4cb2003acaf84787e35e" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.375271 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qd22d" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.593084 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vhbb6"] Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.593380 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerName="dnsmasq-dns" containerID="cri-o://b2006a7882e564db2976f58ea596464dc1508227cb617c0c64ce425635602e33" gracePeriod=10 Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.600217 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.624922 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zbmfp"] Dec 01 21:54:24 crc kubenswrapper[4962]: E1201 21:54:24.625433 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ac2b00-3dcf-4181-a7f7-9954685ca96c" containerName="init" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.625444 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ac2b00-3dcf-4181-a7f7-9954685ca96c" containerName="init" Dec 01 21:54:24 crc kubenswrapper[4962]: E1201 21:54:24.625455 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" containerName="keystone-db-sync" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.625463 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" containerName="keystone-db-sync" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.625659 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" 
containerName="keystone-db-sync" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.625686 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ac2b00-3dcf-4181-a7f7-9954685ca96c" containerName="init" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.626397 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.639290 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.639511 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.639650 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.639776 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.640223 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tlj7d" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.651252 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m9jmv"] Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.653492 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.719923 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zbmfp"] Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.747770 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-scripts\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748218 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmg6x\" (UniqueName: \"kubernetes.io/projected/e62f3680-a9bc-455b-a71e-77efaa2f2a87-kube-api-access-wmg6x\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748282 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-combined-ca-bundle\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748313 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748358 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-fernet-keys\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748397 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-credential-keys\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748589 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748619 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmmx4\" (UniqueName: \"kubernetes.io/projected/465331dc-de3a-40d9-b3bd-139e2ee114ec-kube-api-access-tmmx4\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-config-data\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-config\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.748893 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.790018 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m9jmv"] Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851347 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-credential-keys\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 
21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851440 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmmx4\" (UniqueName: \"kubernetes.io/projected/465331dc-de3a-40d9-b3bd-139e2ee114ec-kube-api-access-tmmx4\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851511 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-config-data\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-config\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851590 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851643 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-scripts\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851689 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmg6x\" (UniqueName: \"kubernetes.io/projected/e62f3680-a9bc-455b-a71e-77efaa2f2a87-kube-api-access-wmg6x\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-combined-ca-bundle\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851751 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.851789 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-fernet-keys\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.856045 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.856998 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-fernet-keys\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.857745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.860294 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-scripts\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.862889 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-combined-ca-bundle\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.863903 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.866501 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.881369 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-config\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: 
\"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.918186 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-x55sw"] Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.920455 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.926621 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.928559 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-config-data\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.935052 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-zjthz" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.937146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-credential-keys\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.938247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmg6x\" (UniqueName: \"kubernetes.io/projected/e62f3680-a9bc-455b-a71e-77efaa2f2a87-kube-api-access-wmg6x\") pod \"keystone-bootstrap-zbmfp\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.940392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmmx4\" (UniqueName: \"kubernetes.io/projected/465331dc-de3a-40d9-b3bd-139e2ee114ec-kube-api-access-tmmx4\") pod \"dnsmasq-dns-bbf5cc879-m9jmv\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.964074 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-x55sw"] Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.988964 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-q7knr"] Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.990427 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.992906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6rtb4" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.994032 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 21:54:24 crc kubenswrapper[4962]: I1201 21:54:24.995471 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.040215 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q7knr"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.075271 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-config-data\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.075508 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-combined-ca-bundle\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.075615 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-combined-ca-bundle\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.079254 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-db-sync-config-data\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.079359 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbmn\" (UniqueName: \"kubernetes.io/projected/98cede6f-200e-44d9-a4ee-886de53f2459-kube-api-access-xfbmn\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.086505 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbpm\" (UniqueName: \"kubernetes.io/projected/873a333f-1f2b-4824-86a7-7935ff6908f9-kube-api-access-dtbpm\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.096473 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pvhvh"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.097843 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.101034 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ggdvg" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.103873 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.103962 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.125143 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pvhvh"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.127081 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.142223 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vzzq8"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.143921 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.148483 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.148774 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.148897 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n49jq" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.163765 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-l44sw"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.165519 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.171260 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.171286 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zj84r" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.171536 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.180871 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vzzq8"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.187957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-combined-ca-bundle\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188229 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-config-data\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zb7f\" (UniqueName: \"kubernetes.io/projected/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-kube-api-access-9zb7f\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-combined-ca-bundle\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-combined-ca-bundle\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-config\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188780 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-db-sync-config-data\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188846 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xfbmn\" (UniqueName: \"kubernetes.io/projected/98cede6f-200e-44d9-a4ee-886de53f2459-kube-api-access-xfbmn\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.188924 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbpm\" (UniqueName: \"kubernetes.io/projected/873a333f-1f2b-4824-86a7-7935ff6908f9-kube-api-access-dtbpm\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.191866 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m9jmv"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.194761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-config-data\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.206998 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-l44sw"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.209624 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-db-sync-config-data\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.218905 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mfzbs"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.220658 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.223521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbmn\" (UniqueName: \"kubernetes.io/projected/98cede6f-200e-44d9-a4ee-886de53f2459-kube-api-access-xfbmn\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.225393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbpm\" (UniqueName: \"kubernetes.io/projected/873a333f-1f2b-4824-86a7-7935ff6908f9-kube-api-access-dtbpm\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.228493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-combined-ca-bundle\") pod \"barbican-db-sync-q7knr\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.232451 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mfzbs"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.260526 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-combined-ca-bundle\") pod \"heat-db-sync-x55sw\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-combined-ca-bundle\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293419 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-config-data\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-etc-machine-id\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-scripts\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-svc\") pod 
\"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293543 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-logs\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zb7f\" (UniqueName: \"kubernetes.io/projected/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-kube-api-access-9zb7f\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293599 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-combined-ca-bundle\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293633 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-combined-ca-bundle\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293650 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprzw\" (UniqueName: \"kubernetes.io/projected/38501711-21d8-43f0-8657-7507944ef792-kube-api-access-dprzw\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-db-sync-config-data\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-config-data\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-config\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.293739 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9w6\" (UniqueName: \"kubernetes.io/projected/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-kube-api-access-dv9w6\") pod 
\"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.294642 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.294688 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-scripts\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.294732 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m7gn\" (UniqueName: \"kubernetes.io/projected/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-kube-api-access-8m7gn\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.294776 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-config\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.294817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.294837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.305159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-combined-ca-bundle\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.320530 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-config\") pod \"neutron-db-sync-pvhvh\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.346542 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zb7f\" (UniqueName: \"kubernetes.io/projected/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-kube-api-access-9zb7f\") pod \"neutron-db-sync-pvhvh\" (UID: 
\"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.353801 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-x55sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.384503 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.387062 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.391819 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.393724 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.397334 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-config-data\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.397893 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-etc-machine-id\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.398043 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-scripts\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.398116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.398223 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-logs\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.398320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-combined-ca-bundle\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.398411 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-combined-ca-bundle\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 
21:54:25.400026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dprzw\" (UniqueName: \"kubernetes.io/projected/38501711-21d8-43f0-8657-7507944ef792-kube-api-access-dprzw\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.400128 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-db-sync-config-data\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.400237 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-config-data\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.400355 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9w6\" (UniqueName: \"kubernetes.io/projected/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-kube-api-access-dv9w6\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.400495 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.400614 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-scripts\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.400749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m7gn\" (UniqueName: \"kubernetes.io/projected/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-kube-api-access-8m7gn\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.402672 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-config\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.402784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.402880 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.403888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.403958 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-scripts\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.404019 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.404082 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-etc-machine-id\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.405105 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q7knr" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.399566 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.398893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-logs\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.406948 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-config\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.407774 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.409425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-combined-ca-bundle\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc 
kubenswrapper[4962]: I1201 21:54:25.410022 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.425835 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-config-data\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.426206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-combined-ca-bundle\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.426876 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-config-data\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.428691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprzw\" (UniqueName: \"kubernetes.io/projected/38501711-21d8-43f0-8657-7507944ef792-kube-api-access-dprzw\") pod \"dnsmasq-dns-56df8fb6b7-mfzbs\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.429125 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-db-sync-config-data\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.429822 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9w6\" (UniqueName: \"kubernetes.io/projected/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-kube-api-access-dv9w6\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.430312 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-scripts\") pod \"cinder-db-sync-vzzq8\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.430713 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m7gn\" (UniqueName: \"kubernetes.io/projected/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-kube-api-access-8m7gn\") pod \"placement-db-sync-l44sw\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") " pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.431462 4962 generic.go:334] "Generic (PLEG): container finished" podID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" 
containerID="b2006a7882e564db2976f58ea596464dc1508227cb617c0c64ce425635602e33" exitCode=0 Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.431466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" event={"ID":"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af","Type":"ContainerDied","Data":"b2006a7882e564db2976f58ea596464dc1508227cb617c0c64ce425635602e33"} Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.433724 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.482895 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.513475 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxvtk\" (UniqueName: \"kubernetes.io/projected/624dc66d-d4f2-447b-861b-23987a75a3d6-kube-api-access-jxvtk\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.513520 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-log-httpd\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.513628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.513645 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-run-httpd\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.513669 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.513738 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-config-data\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.513762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-scripts\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.523000 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.561591 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-l44sw" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.581259 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-nb\") pod \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618226 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-config\") pod \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618282 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-swift-storage-0\") pod \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618359 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crk5v\" (UniqueName: \"kubernetes.io/projected/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-kube-api-access-crk5v\") pod \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618492 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-svc\") pod \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618529 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-sb\") pod \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\" (UID: \"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af\") " Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-run-httpd\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618958 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-config-data\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.618982 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-scripts\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.619020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxvtk\" (UniqueName: \"kubernetes.io/projected/624dc66d-d4f2-447b-861b-23987a75a3d6-kube-api-access-jxvtk\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.619044 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-log-httpd\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.619485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-log-httpd\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.626691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-run-httpd\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.634799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.635454 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-kube-api-access-crk5v" (OuterVolumeSpecName: "kube-api-access-crk5v") pod "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" (UID: "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af"). InnerVolumeSpecName "kube-api-access-crk5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.638776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-scripts\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.640648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.641542 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-config-data\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.668125 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxvtk\" (UniqueName: \"kubernetes.io/projected/624dc66d-d4f2-447b-861b-23987a75a3d6-kube-api-access-jxvtk\") pod \"ceilometer-0\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") " pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.722133 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crk5v\" (UniqueName: \"kubernetes.io/projected/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-kube-api-access-crk5v\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.737459 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.754256 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m9jmv"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.770091 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:25 crc kubenswrapper[4962]: E1201 21:54:25.770592 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerName="dnsmasq-dns" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.770605 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerName="dnsmasq-dns" Dec 01 21:54:25 crc kubenswrapper[4962]: E1201 21:54:25.770615 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerName="init" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.770621 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerName="init" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.770815 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerName="dnsmasq-dns" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.771890 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.780305 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.780485 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.780546 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.780713 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nj6wq" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.781277 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.833436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" (UID: "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.838240 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.838297 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-config-data\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.838716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.838818 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcx72\" (UniqueName: \"kubernetes.io/projected/17a84147-ff3d-4d41-a2d4-355e5e4de88b-kube-api-access-fcx72\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.838888 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-scripts\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.839041 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.839195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-logs\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.839272 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.839425 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.844356 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" (UID: "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941430 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-logs\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941451 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941644 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcx72\" (UniqueName: \"kubernetes.io/projected/17a84147-ff3d-4d41-a2d4-355e5e4de88b-kube-api-access-fcx72\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941661 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-scripts\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.941726 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.949797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.950062 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 01 21:54:25 crc kubenswrapper[4962]: I1201 21:54:25.950587 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-logs\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.006456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-config-data\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.006585 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.006883 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-scripts\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.007174 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.011207 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" (UID: "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.015017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcx72\" (UniqueName: \"kubernetes.io/projected/17a84147-ff3d-4d41-a2d4-355e5e4de88b-kube-api-access-fcx72\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.016154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-config" (OuterVolumeSpecName: "config") pod "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" (UID: "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.025734 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.027503 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.032285 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.039539 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.079561 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.079602 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.091235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" (UID: "98c2491b-c2d9-4a33-a9a9-315c1fbdc8af"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.146052 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.196408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s284m\" (UniqueName: \"kubernetes.io/projected/89c75330-08c2-4341-85cf-2b550d6f0aa7-kube-api-access-s284m\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.196502 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.196664 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.196696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.196774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.196799 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.196971 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.197028 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.197339 4962 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.232699 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: W1201 21:54:26.262054 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode62f3680_a9bc_455b_a71e_77efaa2f2a87.slice/crio-bb169b0d4d8c5668de7066d89351fa85c8aef3e4e790eaf814fa415c23709149 WatchSource:0}: Error finding container bb169b0d4d8c5668de7066d89351fa85c8aef3e4e790eaf814fa415c23709149: Status 404 returned error can't find the container with id bb169b0d4d8c5668de7066d89351fa85c8aef3e4e790eaf814fa415c23709149 Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.270957 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zbmfp"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.300125 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.300163 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.300221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.300243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.300524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.300550 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 
21:54:26.300641 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s284m\" (UniqueName: \"kubernetes.io/projected/89c75330-08c2-4341-85cf-2b550d6f0aa7-kube-api-access-s284m\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.300668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.305178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.305504 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.307515 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.309358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.309461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.310337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.311927 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.339178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s284m\" (UniqueName: \"kubernetes.io/projected/89c75330-08c2-4341-85cf-2b550d6f0aa7-kube-api-access-s284m\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.374200 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.463572 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbmfp" event={"ID":"e62f3680-a9bc-455b-a71e-77efaa2f2a87","Type":"ContainerStarted","Data":"bb169b0d4d8c5668de7066d89351fa85c8aef3e4e790eaf814fa415c23709149"} Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.464636 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.489696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" event={"ID":"465331dc-de3a-40d9-b3bd-139e2ee114ec","Type":"ContainerStarted","Data":"2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce"} Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.489750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" event={"ID":"465331dc-de3a-40d9-b3bd-139e2ee114ec","Type":"ContainerStarted","Data":"66afc612769e2eb4e689e19b3ad237db8ff038114864fdd69523954c4cd55dbd"} Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.500020 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.504853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pvhvh"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.510489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" event={"ID":"98c2491b-c2d9-4a33-a9a9-315c1fbdc8af","Type":"ContainerDied","Data":"d47ae36e1e4e3b6743b00e67da5be55de146099090dd80b4a4c6e5ad1e6cbec0"} Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.510542 4962 scope.go:117] "RemoveContainer" containerID="b2006a7882e564db2976f58ea596464dc1508227cb617c0c64ce425635602e33" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.510726 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" Dec 01 21:54:26 crc kubenswrapper[4962]: W1201 21:54:26.531502 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ecfd3cc_e01f_4a75_ad5e_8c0e44638525.slice/crio-276be4279574e0ab70f52bc0c3ca097dacdf41eae1b2b6f36e99099417c774af WatchSource:0}: Error finding container 276be4279574e0ab70f52bc0c3ca097dacdf41eae1b2b6f36e99099417c774af: Status 404 returned error can't find the container with id 276be4279574e0ab70f52bc0c3ca097dacdf41eae1b2b6f36e99099417c774af Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.555290 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vhbb6"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.571135 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vhbb6"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.576689 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-x55sw"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.646279 4962 scope.go:117] "RemoveContainer" containerID="6ad9975cf2a83da885d1e72842624c13d5c5963975b6061d53d3e0ff43593de5" Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.776574 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q7knr"] Dec 01 21:54:26 crc kubenswrapper[4962]: I1201 21:54:26.798913 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vzzq8"] Dec 01 21:54:26 crc kubenswrapper[4962]: W1201 21:54:26.827622 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29a10360_5fb7_4add_8bf5_1bc35e6e76dd.slice/crio-4d39c5143aa52d6e193b656e023b02164398969e26dc8f00e0e07da4013cde1a WatchSource:0}: Error finding container 4d39c5143aa52d6e193b656e023b02164398969e26dc8f00e0e07da4013cde1a: Status 404 returned error can't find the container with id 4d39c5143aa52d6e193b656e023b02164398969e26dc8f00e0e07da4013cde1a Dec 01 21:54:27 crc kubenswrapper[4962]: W1201 21:54:27.241123 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38501711_21d8_43f0_8657_7507944ef792.slice/crio-5d9f9179d1745ffbf39c9179ad3c2e0c622a0418a15dec8fc5f2b4e27af15075 WatchSource:0}: Error finding container 5d9f9179d1745ffbf39c9179ad3c2e0c622a0418a15dec8fc5f2b4e27af15075: Status 404 returned error can't find the container with id 5d9f9179d1745ffbf39c9179ad3c2e0c622a0418a15dec8fc5f2b4e27af15075 Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.261384 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mfzbs"] Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.295099 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-l44sw"] Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.339870 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.351599 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.478115 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-svc\") pod \"465331dc-de3a-40d9-b3bd-139e2ee114ec\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.478189 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-config\") pod \"465331dc-de3a-40d9-b3bd-139e2ee114ec\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.478292 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-sb\") pod \"465331dc-de3a-40d9-b3bd-139e2ee114ec\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.478332 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmmx4\" (UniqueName: \"kubernetes.io/projected/465331dc-de3a-40d9-b3bd-139e2ee114ec-kube-api-access-tmmx4\") pod \"465331dc-de3a-40d9-b3bd-139e2ee114ec\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.478366 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-swift-storage-0\") pod \"465331dc-de3a-40d9-b3bd-139e2ee114ec\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.478445 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-nb\") pod \"465331dc-de3a-40d9-b3bd-139e2ee114ec\" (UID: \"465331dc-de3a-40d9-b3bd-139e2ee114ec\") " Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.505688 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465331dc-de3a-40d9-b3bd-139e2ee114ec-kube-api-access-tmmx4" (OuterVolumeSpecName: "kube-api-access-tmmx4") pod "465331dc-de3a-40d9-b3bd-139e2ee114ec" (UID: "465331dc-de3a-40d9-b3bd-139e2ee114ec"). InnerVolumeSpecName "kube-api-access-tmmx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.571467 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.571510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l44sw" event={"ID":"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c","Type":"ContainerStarted","Data":"12ad90e3be792bb4ce4c07251a62f3070e17815d0c1d05b7718123681646953e"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.586266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "465331dc-de3a-40d9-b3bd-139e2ee114ec" (UID: "465331dc-de3a-40d9-b3bd-139e2ee114ec"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.587688 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.587710 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmmx4\" (UniqueName: \"kubernetes.io/projected/465331dc-de3a-40d9-b3bd-139e2ee114ec-kube-api-access-tmmx4\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.588870 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "465331dc-de3a-40d9-b3bd-139e2ee114ec" (UID: "465331dc-de3a-40d9-b3bd-139e2ee114ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.599630 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerStarted","Data":"3ea58faafc6a483aea4ddd4602ca9cc031011293c854d83a3f507657532c683e"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.619117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "465331dc-de3a-40d9-b3bd-139e2ee114ec" (UID: "465331dc-de3a-40d9-b3bd-139e2ee114ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.631405 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbmfp" event={"ID":"e62f3680-a9bc-455b-a71e-77efaa2f2a87","Type":"ContainerStarted","Data":"6229a94679f1738612a7755d3dfa7af7a89ee9d8252d01528edb8b738d3cd3cc"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.639783 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.664020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x55sw" event={"ID":"873a333f-1f2b-4824-86a7-7935ff6908f9","Type":"ContainerStarted","Data":"b9155aa85aa567602788e132d7acb942d8ca696a1545b52ae459425e7dff0d50"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.672400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q7knr" event={"ID":"98cede6f-200e-44d9-a4ee-886de53f2459","Type":"ContainerStarted","Data":"d47b5bc6cb9c8a4fab77a102e3a42937b83029a7e3bb86fb46aec16c0e7760f8"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.673862 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zbmfp" podStartSLOduration=3.673841431 podStartE2EDuration="3.673841431s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:27.654606663 +0000 UTC m=+1251.756045858" watchObservedRunningTime="2025-12-01 21:54:27.673841431 +0000 UTC m=+1251.775280616" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.685294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-pvhvh" event={"ID":"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525","Type":"ContainerStarted","Data":"93735b0229348dd1a45e61ee147ea636c8eb78d00c50726b4a269773fbd0fea9"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.685622 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pvhvh" event={"ID":"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525","Type":"ContainerStarted","Data":"276be4279574e0ab70f52bc0c3ca097dacdf41eae1b2b6f36e99099417c774af"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.690736 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.690766 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.703470 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-config" (OuterVolumeSpecName: "config") pod "465331dc-de3a-40d9-b3bd-139e2ee114ec" (UID: "465331dc-de3a-40d9-b3bd-139e2ee114ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.706627 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" event={"ID":"38501711-21d8-43f0-8657-7507944ef792","Type":"ContainerStarted","Data":"5d9f9179d1745ffbf39c9179ad3c2e0c622a0418a15dec8fc5f2b4e27af15075"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.710600 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.725052 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pvhvh" podStartSLOduration=3.724998349 podStartE2EDuration="3.724998349s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:27.707127209 +0000 UTC m=+1251.808566404" watchObservedRunningTime="2025-12-01 21:54:27.724998349 +0000 UTC m=+1251.826437544" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.736592 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vzzq8" event={"ID":"29a10360-5fb7-4add-8bf5-1bc35e6e76dd","Type":"ContainerStarted","Data":"4d39c5143aa52d6e193b656e023b02164398969e26dc8f00e0e07da4013cde1a"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.753251 4962 generic.go:334] "Generic (PLEG): container finished" podID="465331dc-de3a-40d9-b3bd-139e2ee114ec" containerID="2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce" exitCode=0 Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.753355 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" event={"ID":"465331dc-de3a-40d9-b3bd-139e2ee114ec","Type":"ContainerDied","Data":"2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.753385 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" 
event={"ID":"465331dc-de3a-40d9-b3bd-139e2ee114ec","Type":"ContainerDied","Data":"66afc612769e2eb4e689e19b3ad237db8ff038114864fdd69523954c4cd55dbd"} Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.753402 4962 scope.go:117] "RemoveContainer" containerID="2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.753543 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m9jmv" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.778561 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "465331dc-de3a-40d9-b3bd-139e2ee114ec" (UID: "465331dc-de3a-40d9-b3bd-139e2ee114ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.792897 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:27 crc kubenswrapper[4962]: I1201 21:54:27.792942 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/465331dc-de3a-40d9-b3bd-139e2ee114ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.002701 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.370977 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" path="/var/lib/kubelet/pods/98c2491b-c2d9-4a33-a9a9-315c1fbdc8af/volumes" Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.422432 4962 scope.go:117] "RemoveContainer" containerID="2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce" Dec 01 21:54:28 crc kubenswrapper[4962]: E1201 21:54:28.423238 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce\": container with ID starting with 2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce not found: ID does not exist" containerID="2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce" Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.423269 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce"} err="failed to get container status \"2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce\": rpc error: code = NotFound desc = could not find container \"2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce\": container with ID starting with 2441265647eb3b87fb7087dd96055a21e6e57fdd3a1bff68bbba3d45a3721cce not found: ID does not exist" Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.466509 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m9jmv"] Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.500175 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m9jmv"] Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.772919 4962 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.802633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"17a84147-ff3d-4d41-a2d4-355e5e4de88b","Type":"ContainerStarted","Data":"109a2fc31fcf269b4f6399b57e258f7c0c1a91d9678d2427681491a4a863b584"} Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.805481 4962 generic.go:334] "Generic (PLEG): container finished" podID="38501711-21d8-43f0-8657-7507944ef792" containerID="d6f19e7483f9073cbf641cc7a0feb21630f8bbdbaaa2b28af533b631218286cd" exitCode=0 Dec 01 21:54:28 crc kubenswrapper[4962]: I1201 21:54:28.805823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" event={"ID":"38501711-21d8-43f0-8657-7507944ef792","Type":"ContainerDied","Data":"d6f19e7483f9073cbf641cc7a0feb21630f8bbdbaaa2b28af533b631218286cd"} Dec 01 21:54:29 crc kubenswrapper[4962]: I1201 21:54:29.872019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c75330-08c2-4341-85cf-2b550d6f0aa7","Type":"ContainerStarted","Data":"af7a0cf9fe1c1a9e78d9acbcc9b7a4e8c305f25be58f1939ca78dc3a63a835fa"} Dec 01 21:54:29 crc kubenswrapper[4962]: I1201 21:54:29.883279 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"17a84147-ff3d-4d41-a2d4-355e5e4de88b","Type":"ContainerStarted","Data":"a3a2f5927c7be4aa76873ff4b6e350e8355d3302eb751bd3f94301a0f63ad51f"} Dec 01 21:54:29 crc kubenswrapper[4962]: I1201 21:54:29.888168 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" event={"ID":"38501711-21d8-43f0-8657-7507944ef792","Type":"ContainerStarted","Data":"b40b507046667320549cf3401127527920805e6dcd9c4165a87336393f712877"} Dec 01 21:54:29 crc kubenswrapper[4962]: I1201 21:54:29.888346 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:29 crc kubenswrapper[4962]: I1201 21:54:29.913342 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" podStartSLOduration=4.913320477 podStartE2EDuration="4.913320477s" podCreationTimestamp="2025-12-01 21:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:29.905310609 +0000 UTC m=+1254.006749804" watchObservedRunningTime="2025-12-01 21:54:29.913320477 +0000 UTC m=+1254.014759692" Dec 01 21:54:29 crc kubenswrapper[4962]: I1201 21:54:29.983521 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-vhbb6" podUID="98c2491b-c2d9-4a33-a9a9-315c1fbdc8af" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.174:5353: i/o timeout" Dec 01 21:54:30 crc kubenswrapper[4962]: I1201 21:54:30.248610 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465331dc-de3a-40d9-b3bd-139e2ee114ec" path="/var/lib/kubelet/pods/465331dc-de3a-40d9-b3bd-139e2ee114ec/volumes" Dec 01 21:54:30 crc kubenswrapper[4962]: I1201 21:54:30.904197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"17a84147-ff3d-4d41-a2d4-355e5e4de88b","Type":"ContainerStarted","Data":"9c6dfed22e756ed7dfd4e7eb66ce3586e164f648f3d32338a9adf8545f1e983e"} Dec 01 21:54:30 crc 
kubenswrapper[4962]: I1201 21:54:30.904553 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-log" containerID="cri-o://a3a2f5927c7be4aa76873ff4b6e350e8355d3302eb751bd3f94301a0f63ad51f" gracePeriod=30 Dec 01 21:54:30 crc kubenswrapper[4962]: I1201 21:54:30.905044 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-httpd" containerID="cri-o://9c6dfed22e756ed7dfd4e7eb66ce3586e164f648f3d32338a9adf8545f1e983e" gracePeriod=30 Dec 01 21:54:30 crc kubenswrapper[4962]: I1201 21:54:30.912700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c75330-08c2-4341-85cf-2b550d6f0aa7","Type":"ContainerStarted","Data":"6af0e933d1fe26a65c58702f662ce24d423e525ea0b2984d4e048663b5ca361a"} Dec 01 21:54:30 crc kubenswrapper[4962]: I1201 21:54:30.936004 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.935982805 podStartE2EDuration="6.935982805s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:30.930250771 +0000 UTC m=+1255.031689976" watchObservedRunningTime="2025-12-01 21:54:30.935982805 +0000 UTC m=+1255.037422000" Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.925264 4962 generic.go:334] "Generic (PLEG): container finished" podID="e62f3680-a9bc-455b-a71e-77efaa2f2a87" containerID="6229a94679f1738612a7755d3dfa7af7a89ee9d8252d01528edb8b738d3cd3cc" exitCode=0 Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.925345 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbmfp" event={"ID":"e62f3680-a9bc-455b-a71e-77efaa2f2a87","Type":"ContainerDied","Data":"6229a94679f1738612a7755d3dfa7af7a89ee9d8252d01528edb8b738d3cd3cc"} Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.929675 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c75330-08c2-4341-85cf-2b550d6f0aa7","Type":"ContainerStarted","Data":"0400b21828f01f4d9b4cbf879769a0b2f3b0f5c68005be933879f3e756179064"} Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.929845 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-log" containerID="cri-o://6af0e933d1fe26a65c58702f662ce24d423e525ea0b2984d4e048663b5ca361a" gracePeriod=30 Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.929986 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-httpd" containerID="cri-o://0400b21828f01f4d9b4cbf879769a0b2f3b0f5c68005be933879f3e756179064" gracePeriod=30 Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.934025 4962 generic.go:334] "Generic (PLEG): container finished" podID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerID="9c6dfed22e756ed7dfd4e7eb66ce3586e164f648f3d32338a9adf8545f1e983e" exitCode=0 Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.934095 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"17a84147-ff3d-4d41-a2d4-355e5e4de88b","Type":"ContainerDied","Data":"9c6dfed22e756ed7dfd4e7eb66ce3586e164f648f3d32338a9adf8545f1e983e"} Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.934769 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"17a84147-ff3d-4d41-a2d4-355e5e4de88b","Type":"ContainerDied","Data":"a3a2f5927c7be4aa76873ff4b6e350e8355d3302eb751bd3f94301a0f63ad51f"} Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.934848 4962 generic.go:334] "Generic (PLEG): container finished" podID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerID="a3a2f5927c7be4aa76873ff4b6e350e8355d3302eb751bd3f94301a0f63ad51f" exitCode=143 Dec 01 21:54:31 crc kubenswrapper[4962]: I1201 21:54:31.977663 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.977642904 podStartE2EDuration="7.977642904s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:31.965895469 +0000 UTC m=+1256.067334674" watchObservedRunningTime="2025-12-01 21:54:31.977642904 +0000 UTC m=+1256.079082099" Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.784852 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.784924 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.785018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.785853 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e98140d5fb11879a3903d3761dc38b8ef264c041494b571b46af54f4f57bb50"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.785915 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://7e98140d5fb11879a3903d3761dc38b8ef264c041494b571b46af54f4f57bb50" gracePeriod=600 Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.963271 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="7e98140d5fb11879a3903d3761dc38b8ef264c041494b571b46af54f4f57bb50" exitCode=0 Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.963340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"7e98140d5fb11879a3903d3761dc38b8ef264c041494b571b46af54f4f57bb50"} Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.963379 4962 scope.go:117] "RemoveContainer" containerID="95b773e188f611e19f1e133dda091ac575dae9bb165debbd86a90d7593910a0b" Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.967102 4962 generic.go:334] "Generic (PLEG): container finished" podID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerID="0400b21828f01f4d9b4cbf879769a0b2f3b0f5c68005be933879f3e756179064" exitCode=0 Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.967140 4962 generic.go:334] "Generic (PLEG): container finished" podID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerID="6af0e933d1fe26a65c58702f662ce24d423e525ea0b2984d4e048663b5ca361a" exitCode=143 Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.967232 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c75330-08c2-4341-85cf-2b550d6f0aa7","Type":"ContainerDied","Data":"0400b21828f01f4d9b4cbf879769a0b2f3b0f5c68005be933879f3e756179064"} Dec 01 21:54:32 crc kubenswrapper[4962]: I1201 21:54:32.967315 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c75330-08c2-4341-85cf-2b550d6f0aa7","Type":"ContainerDied","Data":"6af0e933d1fe26a65c58702f662ce24d423e525ea0b2984d4e048663b5ca361a"} Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.743217 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.829254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-scripts\") pod \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.829292 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-config-data\") pod \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.829368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-fernet-keys\") pod \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.829391 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-combined-ca-bundle\") pod \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.829467 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-credential-keys\") pod \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.829509 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wmg6x\" (UniqueName: \"kubernetes.io/projected/e62f3680-a9bc-455b-a71e-77efaa2f2a87-kube-api-access-wmg6x\") pod \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\" (UID: \"e62f3680-a9bc-455b-a71e-77efaa2f2a87\") " Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.836272 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e62f3680-a9bc-455b-a71e-77efaa2f2a87-kube-api-access-wmg6x" (OuterVolumeSpecName: "kube-api-access-wmg6x") pod "e62f3680-a9bc-455b-a71e-77efaa2f2a87" (UID: "e62f3680-a9bc-455b-a71e-77efaa2f2a87"). InnerVolumeSpecName "kube-api-access-wmg6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.836307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-scripts" (OuterVolumeSpecName: "scripts") pod "e62f3680-a9bc-455b-a71e-77efaa2f2a87" (UID: "e62f3680-a9bc-455b-a71e-77efaa2f2a87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.838176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e62f3680-a9bc-455b-a71e-77efaa2f2a87" (UID: "e62f3680-a9bc-455b-a71e-77efaa2f2a87"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.862370 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e62f3680-a9bc-455b-a71e-77efaa2f2a87" (UID: "e62f3680-a9bc-455b-a71e-77efaa2f2a87"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.864834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-config-data" (OuterVolumeSpecName: "config-data") pod "e62f3680-a9bc-455b-a71e-77efaa2f2a87" (UID: "e62f3680-a9bc-455b-a71e-77efaa2f2a87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.871714 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e62f3680-a9bc-455b-a71e-77efaa2f2a87" (UID: "e62f3680-a9bc-455b-a71e-77efaa2f2a87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.932887 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.932924 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.932958 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.932971 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.932986 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e62f3680-a9bc-455b-a71e-77efaa2f2a87-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.932998 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmg6x\" (UniqueName: \"kubernetes.io/projected/e62f3680-a9bc-455b-a71e-77efaa2f2a87-kube-api-access-wmg6x\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.987799 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbmfp" event={"ID":"e62f3680-a9bc-455b-a71e-77efaa2f2a87","Type":"ContainerDied","Data":"bb169b0d4d8c5668de7066d89351fa85c8aef3e4e790eaf814fa415c23709149"} Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.987857 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb169b0d4d8c5668de7066d89351fa85c8aef3e4e790eaf814fa415c23709149" Dec 01 21:54:33 crc kubenswrapper[4962]: I1201 21:54:33.988067 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zbmfp" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.027501 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zbmfp"] Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.041033 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zbmfp"] Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.117775 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-762cx"] Dec 01 21:54:34 crc kubenswrapper[4962]: E1201 21:54:34.118306 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465331dc-de3a-40d9-b3bd-139e2ee114ec" containerName="init" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.118321 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="465331dc-de3a-40d9-b3bd-139e2ee114ec" containerName="init" Dec 01 21:54:34 crc kubenswrapper[4962]: E1201 21:54:34.118338 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62f3680-a9bc-455b-a71e-77efaa2f2a87" containerName="keystone-bootstrap" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.118345 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62f3680-a9bc-455b-a71e-77efaa2f2a87" containerName="keystone-bootstrap" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.118575 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="465331dc-de3a-40d9-b3bd-139e2ee114ec" containerName="init" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.118589 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e62f3680-a9bc-455b-a71e-77efaa2f2a87" containerName="keystone-bootstrap" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.119331 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.125456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.126012 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.129012 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-762cx"] Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.129471 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.129549 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.142107 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tlj7d" Dec 01 21:54:34 crc kubenswrapper[4962]: E1201 21:54:34.207995 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode62f3680_a9bc_455b_a71e_77efaa2f2a87.slice\": RecentStats: unable to find data in memory cache]" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.235836 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e62f3680-a9bc-455b-a71e-77efaa2f2a87" path="/var/lib/kubelet/pods/e62f3680-a9bc-455b-a71e-77efaa2f2a87/volumes" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.239712 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-fernet-keys\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.239952 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-combined-ca-bundle\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.240207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-scripts\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.240376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-config-data\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.240593 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/7c13da36-7f69-4c0e-b830-8b71bdc181b2-kube-api-access-8bqzn\") pod \"keystone-bootstrap-762cx\" (UID: 
\"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.240688 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-credential-keys\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.342185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-combined-ca-bundle\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.342245 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-scripts\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.342287 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-config-data\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.342364 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/7c13da36-7f69-4c0e-b830-8b71bdc181b2-kube-api-access-8bqzn\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.342408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-credential-keys\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.342443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-fernet-keys\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.354555 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-config-data\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.361495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-fernet-keys\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.362626 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-scripts\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.379416 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/7c13da36-7f69-4c0e-b830-8b71bdc181b2-kube-api-access-8bqzn\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.380431 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-credential-keys\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.380599 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-combined-ca-bundle\") pod \"keystone-bootstrap-762cx\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") " pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:34 crc kubenswrapper[4962]: I1201 21:54:34.449689 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-762cx" Dec 01 21:54:35 crc kubenswrapper[4962]: I1201 21:54:35.583107 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:54:35 crc kubenswrapper[4962]: I1201 21:54:35.657499 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nn26f"] Dec 01 21:54:35 crc kubenswrapper[4962]: I1201 21:54:35.657756 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="dnsmasq-dns" containerID="cri-o://749385e365b4e87b4a31f8cc1590103d02a4cbb7c023be6575567602b907c9d7" gracePeriod=10 Dec 01 21:54:37 crc kubenswrapper[4962]: I1201 21:54:37.026912 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerID="749385e365b4e87b4a31f8cc1590103d02a4cbb7c023be6575567602b907c9d7" exitCode=0 Dec 01 21:54:37 crc kubenswrapper[4962]: I1201 21:54:37.026978 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" event={"ID":"bfb8d480-e429-4cd8-b9c8-1361a41deb16","Type":"ContainerDied","Data":"749385e365b4e87b4a31f8cc1590103d02a4cbb7c023be6575567602b907c9d7"} Dec 01 21:54:39 crc kubenswrapper[4962]: I1201 21:54:39.953130 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused" Dec 01 21:54:43 crc kubenswrapper[4962]: E1201 21:54:43.153364 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 01 21:54:43 crc kubenswrapper[4962]: E1201 21:54:43.154244 4962 
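The PullImage failure just above is the first of several: the placement, heat, and barbican db-sync images all fail the same way with "copying config: context canceled" and then sit in ImagePullBackOff. A sketch that lists the affected images and the last error seen for each from the journal text; the helper name is an illustrative assumption:

```python
import re

# Matches CRI pull errors, e.g.
#   "PullImage from image service failed" err="rpc error: ..." image="quay.io/..."
PULL_FAIL = re.compile(
    r'"PullImage from image service failed" err="(?P<err>[^"]+)" image="(?P<image>[^"]+)"'
)

def failed_pulls(journal_text: str) -> dict[str, str]:
    """Return image -> last pull error observed for it."""
    return {m["image"]: m["err"] for m in PULL_FAIL.finditer(journal_text)}
```

Dec 01 21:54:43 crc kubenswrapper[4962]: E1201 21:54:43.154244 4962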
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8m7gn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-l44sw_openstack(eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:54:43 crc kubenswrapper[4962]: E1201 21:54:43.155478 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-l44sw" podUID="eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" Dec 01 21:54:44 crc kubenswrapper[4962]: E1201 21:54:44.163004 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-l44sw" podUID="eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" Dec 01 21:54:48 crc kubenswrapper[4962]: I1201 21:54:48.219441 4962 generic.go:334] "Generic (PLEG): container finished" podID="4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" containerID="93735b0229348dd1a45e61ee147ea636c8eb78d00c50726b4a269773fbd0fea9" exitCode=0 Dec 01 21:54:48 crc kubenswrapper[4962]: I1201 21:54:48.234576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pvhvh" 
event={"ID":"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525","Type":"ContainerDied","Data":"93735b0229348dd1a45e61ee147ea636c8eb78d00c50726b4a269773fbd0fea9"} Dec 01 21:54:49 crc kubenswrapper[4962]: I1201 21:54:49.954195 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: i/o timeout" Dec 01 21:54:52 crc kubenswrapper[4962]: E1201 21:54:52.001729 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 01 21:54:52 crc kubenswrapper[4962]: E1201 21:54:52.002311 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dtbpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-x55sw_openstack(873a333f-1f2b-4824-86a7-7935ff6908f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:54:52 crc kubenswrapper[4962]: E1201 21:54:52.003501 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-x55sw" podUID="873a333f-1f2b-4824-86a7-7935ff6908f9" Dec 01 21:54:52 crc kubenswrapper[4962]: E1201 21:54:52.262031 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-x55sw" podUID="873a333f-1f2b-4824-86a7-7935ff6908f9" Dec 01 21:54:52 crc kubenswrapper[4962]: E1201 21:54:52.517832 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 21:54:52 crc kubenswrapper[4962]: E1201 21:54:52.517996 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xfbmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-q7knr_openstack(98cede6f-200e-44d9-a4ee-886de53f2459): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 21:54:52 crc kubenswrapper[4962]: E1201 21:54:52.519220 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-q7knr" podUID="98cede6f-200e-44d9-a4ee-886de53f2459" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.683442 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.692698 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.706123 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.740046 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-scripts\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-internal-tls-certs\") pod \"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-public-tls-certs\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849827 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zb7f\" (UniqueName: \"kubernetes.io/projected/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-kube-api-access-9zb7f\") pod \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849855 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-combined-ca-bundle\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849883 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-nb\") pod \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849905 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-sb\") pod \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcx72\" (UniqueName: \"kubernetes.io/projected/17a84147-ff3d-4d41-a2d4-355e5e4de88b-kube-api-access-fcx72\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.849981 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bslkm\" (UniqueName: \"kubernetes.io/projected/bfb8d480-e429-4cd8-b9c8-1361a41deb16-kube-api-access-bslkm\") pod \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-scripts\") pod 
\"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850062 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-config\") pod \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850155 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-logs\") pod \"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850210 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-config-data\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850260 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850280 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-config-data\") pod \"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850297 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-httpd-run\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850335 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s284m\" (UniqueName: \"kubernetes.io/projected/89c75330-08c2-4341-85cf-2b550d6f0aa7-kube-api-access-s284m\") pod \"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850355 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-combined-ca-bundle\") pod \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-httpd-run\") pod \"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850400 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-config\") pod \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\" (UID: \"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525\") " Dec 01 21:54:52 crc 
kubenswrapper[4962]: I1201 21:54:52.850426 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-dns-svc\") pod \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\" (UID: \"bfb8d480-e429-4cd8-b9c8-1361a41deb16\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850509 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-logs\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\" (UID: \"17a84147-ff3d-4d41-a2d4-355e5e4de88b\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.850552 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-combined-ca-bundle\") pod \"89c75330-08c2-4341-85cf-2b550d6f0aa7\" (UID: \"89c75330-08c2-4341-85cf-2b550d6f0aa7\") " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.851400 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.851468 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-logs" (OuterVolumeSpecName: "logs") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.855879 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a84147-ff3d-4d41-a2d4-355e5e4de88b-kube-api-access-fcx72" (OuterVolumeSpecName: "kube-api-access-fcx72") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). InnerVolumeSpecName "kube-api-access-fcx72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.857163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.857173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-scripts" (OuterVolumeSpecName: "scripts") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.857441 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-logs" (OuterVolumeSpecName: "logs") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.857556 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.859557 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb8d480-e429-4cd8-b9c8-1361a41deb16-kube-api-access-bslkm" (OuterVolumeSpecName: "kube-api-access-bslkm") pod "bfb8d480-e429-4cd8-b9c8-1361a41deb16" (UID: "bfb8d480-e429-4cd8-b9c8-1361a41deb16"). InnerVolumeSpecName "kube-api-access-bslkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.860882 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.863487 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c75330-08c2-4341-85cf-2b550d6f0aa7-kube-api-access-s284m" (OuterVolumeSpecName: "kube-api-access-s284m") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "kube-api-access-s284m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.878510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-scripts" (OuterVolumeSpecName: "scripts") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.881710 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-kube-api-access-9zb7f" (OuterVolumeSpecName: "kube-api-access-9zb7f") pod "4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" (UID: "4ecfd3cc-e01f-4a75-ad5e-8c0e44638525"). InnerVolumeSpecName "kube-api-access-9zb7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.887767 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.901134 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" (UID: "4ecfd3cc-e01f-4a75-ad5e-8c0e44638525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.909172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-config" (OuterVolumeSpecName: "config") pod "4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" (UID: "4ecfd3cc-e01f-4a75-ad5e-8c0e44638525"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.928630 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfb8d480-e429-4cd8-b9c8-1361a41deb16" (UID: "bfb8d480-e429-4cd8-b9c8-1361a41deb16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.935365 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-config-data" (OuterVolumeSpecName: "config-data") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.946592 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953167 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zb7f\" (UniqueName: \"kubernetes.io/projected/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-kube-api-access-9zb7f\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953191 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953200 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcx72\" (UniqueName: \"kubernetes.io/projected/17a84147-ff3d-4d41-a2d4-355e5e4de88b-kube-api-access-fcx72\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953209 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bslkm\" (UniqueName: \"kubernetes.io/projected/bfb8d480-e429-4cd8-b9c8-1361a41deb16-kube-api-access-bslkm\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953219 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953227 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953252 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953262 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953270 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953278 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s284m\" (UniqueName: \"kubernetes.io/projected/89c75330-08c2-4341-85cf-2b550d6f0aa7-kube-api-access-s284m\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953286 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953293 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89c75330-08c2-4341-85cf-2b550d6f0aa7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953301 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 
21:54:52.953309 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953316 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a84147-ff3d-4d41-a2d4-355e5e4de88b-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953329 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953337 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953344 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.953529 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfb8d480-e429-4cd8-b9c8-1361a41deb16" (UID: "bfb8d480-e429-4cd8-b9c8-1361a41deb16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.956702 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bfb8d480-e429-4cd8-b9c8-1361a41deb16" (UID: "bfb8d480-e429-4cd8-b9c8-1361a41deb16"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.960765 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-config" (OuterVolumeSpecName: "config") pod "bfb8d480-e429-4cd8-b9c8-1361a41deb16" (UID: "bfb8d480-e429-4cd8-b9c8-1361a41deb16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.963750 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "89c75330-08c2-4341-85cf-2b550d6f0aa7" (UID: "89c75330-08c2-4341-85cf-2b550d6f0aa7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.979086 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.979909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-config-data" (OuterVolumeSpecName: "config-data") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.980050 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "17a84147-ff3d-4d41-a2d4-355e5e4de88b" (UID: "17a84147-ff3d-4d41-a2d4-355e5e4de88b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:54:52 crc kubenswrapper[4962]: I1201 21:54:52.982853 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.054925 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89c75330-08c2-4341-85cf-2b550d6f0aa7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.054978 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.054987 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.054996 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.055007 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb8d480-e429-4cd8-b9c8-1361a41deb16-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.055015 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a84147-ff3d-4d41-a2d4-355e5e4de88b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.055025 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.055033 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.273398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" event={"ID":"bfb8d480-e429-4cd8-b9c8-1361a41deb16","Type":"ContainerDied","Data":"2915dc50064e2a305c92086a7b1e04f0220b98ccd3636a8bc15a77f713ed415a"} Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.273435 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.277095 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.277176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"17a84147-ff3d-4d41-a2d4-355e5e4de88b","Type":"ContainerDied","Data":"109a2fc31fcf269b4f6399b57e258f7c0c1a91d9678d2427681491a4a863b584"} Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.280524 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pvhvh" event={"ID":"4ecfd3cc-e01f-4a75-ad5e-8c0e44638525","Type":"ContainerDied","Data":"276be4279574e0ab70f52bc0c3ca097dacdf41eae1b2b6f36e99099417c774af"} Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.280556 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="276be4279574e0ab70f52bc0c3ca097dacdf41eae1b2b6f36e99099417c774af" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.280656 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pvhvh" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.283964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"89c75330-08c2-4341-85cf-2b550d6f0aa7","Type":"ContainerDied","Data":"af7a0cf9fe1c1a9e78d9acbcc9b7a4e8c305f25be58f1939ca78dc3a63a835fa"} Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.284632 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.286293 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-q7knr" podUID="98cede6f-200e-44d9-a4ee-886de53f2459" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.328002 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.362901 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.377741 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.378255 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-log" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378269 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-log" Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.378297 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-log" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378303 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-log" Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.378315 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="init" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378320 4962 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="init" Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.378331 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-httpd" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378340 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-httpd" Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.378353 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-httpd" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378359 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-httpd" Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.378370 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="dnsmasq-dns" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378375 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="dnsmasq-dns" Dec 01 21:54:53 crc kubenswrapper[4962]: E1201 21:54:53.378388 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" containerName="neutron-db-sync" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378394 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" containerName="neutron-db-sync" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378589 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-httpd" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378603 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" containerName="glance-log" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378611 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-log" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378618 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" containerName="glance-httpd" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378626 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" containerName="neutron-db-sync" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.378635 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="dnsmasq-dns" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.379802 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.384130 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nj6wq" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.384272 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.384396 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.384490 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.391957 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.417493 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.437018 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.451374 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nn26f"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.462276 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.462674 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.462755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.462902 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.462986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65226\" (UniqueName: \"kubernetes.io/projected/1ded81d0-3be6-4293-abd6-c3434d42667e-kube-api-access-65226\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.463112 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-logs\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.463218 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.463303 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.465546 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nn26f"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.482115 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.484273 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.486611 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.486848 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.491133 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.565567 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.565614 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.565650 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.565676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfknw\" (UniqueName: 
\"kubernetes.io/projected/788bee1d-914f-4efc-acea-67ff250ce73f-kube-api-access-mfknw\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.565827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.565872 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.565917 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566048 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566122 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566145 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65226\" (UniqueName: \"kubernetes.io/projected/1ded81d0-3be6-4293-abd6-c3434d42667e-kube-api-access-65226\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-logs\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566375 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-logs\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566493 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566565 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.566597 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.568659 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.568662 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.569630 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.570187 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-logs\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.571081 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.572176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.572434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.588028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65226\" (UniqueName: \"kubernetes.io/projected/1ded81d0-3be6-4293-abd6-c3434d42667e-kube-api-access-65226\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.608414 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668643 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668723 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668751 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfknw\" (UniqueName: \"kubernetes.io/projected/788bee1d-914f-4efc-acea-67ff250ce73f-kube-api-access-mfknw\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-logs\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.668928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.669105 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.671421 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-logs\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.671981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.672187 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.673612 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.676459 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.679017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.689047 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mfknw\" (UniqueName: \"kubernetes.io/projected/788bee1d-914f-4efc-acea-67ff250ce73f-kube-api-access-mfknw\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.699948 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.712105 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.809091 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:54:53 crc kubenswrapper[4962]: I1201 21:54:53.926467 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qgvg7"] Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.034355 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qgvg7"] Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.034709 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.084036 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.084105 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.084153 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.084209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-config\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.084227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkmnr\" (UniqueName: \"kubernetes.io/projected/2fc05c2b-9041-4361-8635-826c5a64afd2-kube-api-access-kkmnr\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " 
pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.084256 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.128995 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5579c75d4-fhjn4"] Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.131037 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.133967 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.135044 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ggdvg" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.135229 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.135411 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.175148 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5579c75d4-fhjn4"] Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186358 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-config\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-httpd-config\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdpt9\" (UniqueName: \"kubernetes.io/projected/4898fa68-89d9-4aa6-9b60-4503ad99778e-kube-api-access-wdpt9\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " 
pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186566 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-ovndb-tls-certs\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186597 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-config\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkmnr\" (UniqueName: \"kubernetes.io/projected/2fc05c2b-9041-4361-8635-826c5a64afd2-kube-api-access-kkmnr\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186693 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-combined-ca-bundle\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.186715 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.187806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.188139 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.188463 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-config\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: 
I1201 21:54:54.188694 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.188716 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.230868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkmnr\" (UniqueName: \"kubernetes.io/projected/2fc05c2b-9041-4361-8635-826c5a64afd2-kube-api-access-kkmnr\") pod \"dnsmasq-dns-6b7b667979-qgvg7\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.233259 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a84147-ff3d-4d41-a2d4-355e5e4de88b" path="/var/lib/kubelet/pods/17a84147-ff3d-4d41-a2d4-355e5e4de88b/volumes" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.234298 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c75330-08c2-4341-85cf-2b550d6f0aa7" path="/var/lib/kubelet/pods/89c75330-08c2-4341-85cf-2b550d6f0aa7/volumes" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.235081 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" path="/var/lib/kubelet/pods/bfb8d480-e429-4cd8-b9c8-1361a41deb16/volumes" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.288861 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-config\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.288991 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-httpd-config\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.289029 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdpt9\" (UniqueName: \"kubernetes.io/projected/4898fa68-89d9-4aa6-9b60-4503ad99778e-kube-api-access-wdpt9\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.289150 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-ovndb-tls-certs\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.289381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-combined-ca-bundle\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.294387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-config\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.295077 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-combined-ca-bundle\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.297015 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-ovndb-tls-certs\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.298573 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-httpd-config\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.310276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdpt9\" (UniqueName: \"kubernetes.io/projected/4898fa68-89d9-4aa6-9b60-4503ad99778e-kube-api-access-wdpt9\") pod \"neutron-5579c75d4-fhjn4\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.396055 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.464610 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.840319 4962 scope.go:117] "RemoveContainer" containerID="749385e365b4e87b4a31f8cc1590103d02a4cbb7c023be6575567602b907c9d7"
Dec 01 21:54:54 crc kubenswrapper[4962]: E1201 21:54:54.853113 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Dec 01 21:54:54 crc kubenswrapper[4962]: E1201 21:54:54.853278 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dv9w6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vzzq8_openstack(29a10360-5fb7-4add-8bf5-1bc35e6e76dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 21:54:54 crc kubenswrapper[4962]: E1201 21:54:54.854795 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vzzq8" podUID="29a10360-5fb7-4add-8bf5-1bc35e6e76dd"
Dec 01 21:54:54 crc kubenswrapper[4962]: I1201 21:54:54.957068 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-nn26f" podUID="bfb8d480-e429-4cd8-b9c8-1361a41deb16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: i/o timeout"
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.062268 4962 scope.go:117] "RemoveContainer" containerID="1594447eaf1039a5d8eefd1cacc041bfc9767ecf6be3bd82b31dd5eca34ade07"
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.142423 4962 scope.go:117] "RemoveContainer" containerID="9c6dfed22e756ed7dfd4e7eb66ce3586e164f648f3d32338a9adf8545f1e983e"
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.235159 4962 scope.go:117] "RemoveContainer" containerID="a3a2f5927c7be4aa76873ff4b6e350e8355d3302eb751bd3f94301a0f63ad51f"
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.318535 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"be316a715ee51336fb8f9d77180528af883eb796f40ec81884b6acc27922aa28"}
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.324922 4962 scope.go:117] "RemoveContainer" containerID="0400b21828f01f4d9b4cbf879769a0b2f3b0f5c68005be933879f3e756179064"
Dec 01 21:54:55 crc kubenswrapper[4962]: E1201 21:54:55.338508 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vzzq8" podUID="29a10360-5fb7-4add-8bf5-1bc35e6e76dd"
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.410353 4962 scope.go:117] "RemoveContainer" containerID="6af0e933d1fe26a65c58702f662ce24d423e525ea0b2984d4e048663b5ca361a"
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.421539 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-762cx"]
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.432232 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.676259 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qgvg7"]
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.808326 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 21:54:55 crc kubenswrapper[4962]: I1201 21:54:55.926799 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 21:54:55 crc kubenswrapper[4962]: W1201 21:54:55.949353 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe WatchSource:0}: Error finding container 8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe: Status 404 returned error can't find the container with id 8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.034772 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5579c75d4-fhjn4"]
Dec 01 21:54:56 crc kubenswrapper[4962]: W1201 21:54:56.045849 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4898fa68_89d9_4aa6_9b60_4503ad99778e.slice/crio-a7b5cb61d6d8aa54419ad0dfa828d9892a01bab53fb748a9e2d78225751479c6 WatchSource:0}: Error finding container a7b5cb61d6d8aa54419ad0dfa828d9892a01bab53fb748a9e2d78225751479c6: Status 404 returned error can't find the container with id a7b5cb61d6d8aa54419ad0dfa828d9892a01bab53fb748a9e2d78225751479c6
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.487082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5579c75d4-fhjn4" event={"ID":"4898fa68-89d9-4aa6-9b60-4503ad99778e","Type":"ContainerStarted","Data":"a7b5cb61d6d8aa54419ad0dfa828d9892a01bab53fb748a9e2d78225751479c6"}
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.502161 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-762cx" event={"ID":"7c13da36-7f69-4c0e-b830-8b71bdc181b2","Type":"ContainerStarted","Data":"ffb379f93d52940c69349e2ca2c9ad09bf479e0b63c8db987b3041ef9b636637"}
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.502422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-762cx" event={"ID":"7c13da36-7f69-4c0e-b830-8b71bdc181b2","Type":"ContainerStarted","Data":"4dffbe865a7e7225897bad38dbc0d4aaab0123066217fed0ccca10e984a6b3a4"}
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.601680 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-762cx" podStartSLOduration=22.601656281 podStartE2EDuration="22.601656281s" podCreationTimestamp="2025-12-01 21:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:56.542736882 +0000 UTC m=+1280.644176067" watchObservedRunningTime="2025-12-01 21:54:56.601656281 +0000 UTC m=+1280.703095476"
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.637258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ded81d0-3be6-4293-abd6-c3434d42667e","Type":"ContainerStarted","Data":"8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe"}
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.659113 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerStarted","Data":"bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897"}
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.667195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"788bee1d-914f-4efc-acea-67ff250ce73f","Type":"ContainerStarted","Data":"cad83d54a0a2223e5c7a4492a3c5e86b9061a353d47a7a476f5b7a5570c7b218"}
Dec 01 21:54:56 crc kubenswrapper[4962]: I1201 21:54:56.685449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" event={"ID":"2fc05c2b-9041-4361-8635-826c5a64afd2","Type":"ContainerStarted","Data":"2c93fcac4a6d1e435abb195fe2941bb633d392dd11030f37db5fec8393d39320"}
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.243716 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b9776df9c-m5wv9"]
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.246434 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.251896 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.252186 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.256593 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b9776df9c-m5wv9"]
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.379298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-public-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.379397 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-httpd-config\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.379417 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-internal-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.379440 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79jt8\" (UniqueName: \"kubernetes.io/projected/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-kube-api-access-79jt8\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.379534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-config\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.379607 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-combined-ca-bundle\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.380313 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-ovndb-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.482419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-public-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9"
\"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-public-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.483474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-httpd-config\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.483662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-internal-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.483764 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79jt8\" (UniqueName: \"kubernetes.io/projected/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-kube-api-access-79jt8\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.483996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-config\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.484100 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-combined-ca-bundle\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.484581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-ovndb-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.488632 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-public-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.490010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-internal-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.490217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-httpd-config\") pod \"neutron-5b9776df9c-m5wv9\" (UID: 
\"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.490787 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-ovndb-tls-certs\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.493487 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-config\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.493719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-combined-ca-bundle\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.517614 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79jt8\" (UniqueName: \"kubernetes.io/projected/6ab1fe3f-42f6-4652-8d33-5f97b860b8fc-kube-api-access-79jt8\") pod \"neutron-5b9776df9c-m5wv9\" (UID: \"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc\") " pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.574347 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.713855 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ded81d0-3be6-4293-abd6-c3434d42667e","Type":"ContainerStarted","Data":"836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303"} Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.719388 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"788bee1d-914f-4efc-acea-67ff250ce73f","Type":"ContainerStarted","Data":"844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca"} Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.722148 4962 generic.go:334] "Generic (PLEG): container finished" podID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerID="584b68309f93cef9db13325a12b8d5d6137af068bfa65cd1ae925e264ba7e8ef" exitCode=0 Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.722235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" event={"ID":"2fc05c2b-9041-4361-8635-826c5a64afd2","Type":"ContainerStarted","Data":"32c1ca42c8a9652c96705182eb7172aa3626be179a0a65e6d6d1359bc68fcdfa"} Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.722257 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" event={"ID":"2fc05c2b-9041-4361-8635-826c5a64afd2","Type":"ContainerDied","Data":"584b68309f93cef9db13325a12b8d5d6137af068bfa65cd1ae925e264ba7e8ef"} Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.722377 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.727549 4962 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-5579c75d4-fhjn4" event={"ID":"4898fa68-89d9-4aa6-9b60-4503ad99778e","Type":"ContainerStarted","Data":"aff509d58e98b5675f1c30d02d6086b8025528aa3dc95c0b52ab03fd5ba9602b"} Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.727596 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5579c75d4-fhjn4" event={"ID":"4898fa68-89d9-4aa6-9b60-4503ad99778e","Type":"ContainerStarted","Data":"358e4acc7850c64440280b5d1c31ea9120e4d19ba36db602808cd65c946405fb"} Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.727613 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.742310 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" podStartSLOduration=4.742292959 podStartE2EDuration="4.742292959s" podCreationTimestamp="2025-12-01 21:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:57.7363535 +0000 UTC m=+1281.837792685" watchObservedRunningTime="2025-12-01 21:54:57.742292959 +0000 UTC m=+1281.843732154" Dec 01 21:54:57 crc kubenswrapper[4962]: I1201 21:54:57.760364 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5579c75d4-fhjn4" podStartSLOduration=3.760348993 podStartE2EDuration="3.760348993s" podCreationTimestamp="2025-12-01 21:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:57.757240185 +0000 UTC m=+1281.858679380" watchObservedRunningTime="2025-12-01 21:54:57.760348993 +0000 UTC m=+1281.861788178" Dec 01 21:54:58 crc kubenswrapper[4962]: I1201 21:54:58.743503 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerStarted","Data":"e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5"} Dec 01 21:54:58 crc kubenswrapper[4962]: I1201 21:54:58.784912 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b9776df9c-m5wv9"] Dec 01 21:54:59 crc kubenswrapper[4962]: I1201 21:54:59.752629 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b9776df9c-m5wv9" event={"ID":"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc","Type":"ContainerStarted","Data":"520df6842e486e1b87d5ac8b568620544f56a2738b4cb606071bf802b0d8e524"} Dec 01 21:54:59 crc kubenswrapper[4962]: I1201 21:54:59.753318 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b9776df9c-m5wv9" event={"ID":"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc","Type":"ContainerStarted","Data":"552c64cf16000950b8a3dfe93bff1df8ae148a8b6ac8dd5e04e43c954d96c644"} Dec 01 21:54:59 crc kubenswrapper[4962]: I1201 21:54:59.755701 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"788bee1d-914f-4efc-acea-67ff250ce73f","Type":"ContainerStarted","Data":"e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972"} Dec 01 21:54:59 crc kubenswrapper[4962]: I1201 21:54:59.757577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ded81d0-3be6-4293-abd6-c3434d42667e","Type":"ContainerStarted","Data":"13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6"} Dec 01 21:54:59 crc 
kubenswrapper[4962]: I1201 21:54:59.759011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l44sw" event={"ID":"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c","Type":"ContainerStarted","Data":"45946b1574666843499789ecb531de96bd76a29871dbdec68756db81bc11fbc6"} Dec 01 21:54:59 crc kubenswrapper[4962]: I1201 21:54:59.780969 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.780947304 podStartE2EDuration="6.780947304s" podCreationTimestamp="2025-12-01 21:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:59.771506255 +0000 UTC m=+1283.872945450" watchObservedRunningTime="2025-12-01 21:54:59.780947304 +0000 UTC m=+1283.882386499" Dec 01 21:54:59 crc kubenswrapper[4962]: I1201 21:54:59.803998 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-l44sw" podStartSLOduration=4.1582086910000005 podStartE2EDuration="35.80397478s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="2025-12-01 21:54:27.3119629 +0000 UTC m=+1251.413402095" lastFinishedPulling="2025-12-01 21:54:58.957728979 +0000 UTC m=+1283.059168184" observedRunningTime="2025-12-01 21:54:59.788553991 +0000 UTC m=+1283.889993196" watchObservedRunningTime="2025-12-01 21:54:59.80397478 +0000 UTC m=+1283.905413985" Dec 01 21:54:59 crc kubenswrapper[4962]: I1201 21:54:59.819123 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.8191020810000005 podStartE2EDuration="6.819102081s" podCreationTimestamp="2025-12-01 21:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:54:59.811631418 +0000 UTC m=+1283.913070633" watchObservedRunningTime="2025-12-01 21:54:59.819102081 +0000 UTC m=+1283.920541286" Dec 01 21:55:00 crc kubenswrapper[4962]: I1201 21:55:00.771727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b9776df9c-m5wv9" event={"ID":"6ab1fe3f-42f6-4652-8d33-5f97b860b8fc","Type":"ContainerStarted","Data":"955c662668a7287cc9a0c8a8012879ad4a380c72cb2dc57ac2bfaafab273eb2d"} Dec 01 21:55:00 crc kubenswrapper[4962]: I1201 21:55:00.773474 4962 generic.go:334] "Generic (PLEG): container finished" podID="7c13da36-7f69-4c0e-b830-8b71bdc181b2" containerID="ffb379f93d52940c69349e2ca2c9ad09bf479e0b63c8db987b3041ef9b636637" exitCode=0 Dec 01 21:55:00 crc kubenswrapper[4962]: I1201 21:55:00.773552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-762cx" event={"ID":"7c13da36-7f69-4c0e-b830-8b71bdc181b2","Type":"ContainerDied","Data":"ffb379f93d52940c69349e2ca2c9ad09bf479e0b63c8db987b3041ef9b636637"} Dec 01 21:55:01 crc kubenswrapper[4962]: I1201 21:55:01.823693 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b9776df9c-m5wv9" podStartSLOduration=4.823668114 podStartE2EDuration="4.823668114s" podCreationTimestamp="2025-12-01 21:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:01.81617465 +0000 UTC m=+1285.917613835" watchObservedRunningTime="2025-12-01 21:55:01.823668114 +0000 UTC m=+1285.925107309" Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.712972 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.713030 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.809450 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.809494 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.829720 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.829894 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.867444 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:03 crc kubenswrapper[4962]: I1201 21:55:03.872206 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.401533 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7"
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.484711 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mfzbs"]
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.485970 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" podUID="38501711-21d8-43f0-8657-7507944ef792" containerName="dnsmasq-dns" containerID="cri-o://b40b507046667320549cf3401127527920805e6dcd9c4165a87336393f712877" gracePeriod=10
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.838452 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-762cx" event={"ID":"7c13da36-7f69-4c0e-b830-8b71bdc181b2","Type":"ContainerDied","Data":"4dffbe865a7e7225897bad38dbc0d4aaab0123066217fed0ccca10e984a6b3a4"}
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.838682 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dffbe865a7e7225897bad38dbc0d4aaab0123066217fed0ccca10e984a6b3a4"
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.838699 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.838710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.838719 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.838727 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:04 crc kubenswrapper[4962]: I1201 21:55:04.932350 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-762cx"
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.023120 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-combined-ca-bundle\") pod \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") "
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.023196 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-credential-keys\") pod \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") "
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.023255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-fernet-keys\") pod \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") "
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.023329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-config-data\") pod \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") "
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.023466 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/7c13da36-7f69-4c0e-b830-8b71bdc181b2-kube-api-access-8bqzn\") pod \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") "
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.023492 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-scripts\") pod \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\" (UID: \"7c13da36-7f69-4c0e-b830-8b71bdc181b2\") "
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.030953 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7c13da36-7f69-4c0e-b830-8b71bdc181b2" (UID: "7c13da36-7f69-4c0e-b830-8b71bdc181b2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.031466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-scripts" (OuterVolumeSpecName: "scripts") pod "7c13da36-7f69-4c0e-b830-8b71bdc181b2" (UID: "7c13da36-7f69-4c0e-b830-8b71bdc181b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.032710 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7c13da36-7f69-4c0e-b830-8b71bdc181b2" (UID: "7c13da36-7f69-4c0e-b830-8b71bdc181b2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.048097 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c13da36-7f69-4c0e-b830-8b71bdc181b2-kube-api-access-8bqzn" (OuterVolumeSpecName: "kube-api-access-8bqzn") pod "7c13da36-7f69-4c0e-b830-8b71bdc181b2" (UID: "7c13da36-7f69-4c0e-b830-8b71bdc181b2"). InnerVolumeSpecName "kube-api-access-8bqzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.057059 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-config-data" (OuterVolumeSpecName: "config-data") pod "7c13da36-7f69-4c0e-b830-8b71bdc181b2" (UID: "7c13da36-7f69-4c0e-b830-8b71bdc181b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.067122 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c13da36-7f69-4c0e-b830-8b71bdc181b2" (UID: "7c13da36-7f69-4c0e-b830-8b71bdc181b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.126659 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.126693 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.126703 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.126711 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.126720 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bqzn\" (UniqueName: \"kubernetes.io/projected/7c13da36-7f69-4c0e-b830-8b71bdc181b2-kube-api-access-8bqzn\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.126730 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13da36-7f69-4c0e-b830-8b71bdc181b2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.853062 4962 generic.go:334] "Generic (PLEG): container finished" podID="38501711-21d8-43f0-8657-7507944ef792" containerID="b40b507046667320549cf3401127527920805e6dcd9c4165a87336393f712877" exitCode=0 Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.854151 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-762cx" Dec 01 21:55:05 crc kubenswrapper[4962]: I1201 21:55:05.854884 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" event={"ID":"38501711-21d8-43f0-8657-7507944ef792","Type":"ContainerDied","Data":"b40b507046667320549cf3401127527920805e6dcd9c4165a87336393f712877"} Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.058672 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bfc984cd5-wc42c"] Dec 01 21:55:06 crc kubenswrapper[4962]: E1201 21:55:06.059422 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c13da36-7f69-4c0e-b830-8b71bdc181b2" containerName="keystone-bootstrap" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.059435 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c13da36-7f69-4c0e-b830-8b71bdc181b2" containerName="keystone-bootstrap" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.059635 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c13da36-7f69-4c0e-b830-8b71bdc181b2" containerName="keystone-bootstrap" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.060436 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.067185 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.067474 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.083485 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.083709 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tlj7d" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.083857 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.083989 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.094110 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bfc984cd5-wc42c"] Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.111682 4962 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.193672 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-config\") pod \"38501711-21d8-43f0-8657-7507944ef792\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") "
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.193782 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dprzw\" (UniqueName: \"kubernetes.io/projected/38501711-21d8-43f0-8657-7507944ef792-kube-api-access-dprzw\") pod \"38501711-21d8-43f0-8657-7507944ef792\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") "
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.193905 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-svc\") pod \"38501711-21d8-43f0-8657-7507944ef792\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") "
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.193925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0\") pod \"38501711-21d8-43f0-8657-7507944ef792\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") "
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194066 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-nb\") pod \"38501711-21d8-43f0-8657-7507944ef792\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") "
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194117 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-sb\") pod \"38501711-21d8-43f0-8657-7507944ef792\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") "
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-credential-keys\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194433 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-internal-tls-certs\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194461 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-fernet-keys\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-combined-ca-bundle\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194549 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-public-tls-certs\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194569 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-config-data\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194643 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-scripts\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.194704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4r6g\" (UniqueName: \"kubernetes.io/projected/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-kube-api-access-s4r6g\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.209317 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38501711-21d8-43f0-8657-7507944ef792-kube-api-access-dprzw" (OuterVolumeSpecName: "kube-api-access-dprzw") pod "38501711-21d8-43f0-8657-7507944ef792" (UID: "38501711-21d8-43f0-8657-7507944ef792"). InnerVolumeSpecName "kube-api-access-dprzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.281707 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38501711-21d8-43f0-8657-7507944ef792" (UID: "38501711-21d8-43f0-8657-7507944ef792"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.297590 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38501711-21d8-43f0-8657-7507944ef792" (UID: "38501711-21d8-43f0-8657-7507944ef792"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.299147 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0\") pod \"38501711-21d8-43f0-8657-7507944ef792\" (UID: \"38501711-21d8-43f0-8657-7507944ef792\") " Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.299724 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-scripts\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.299823 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r6g\" (UniqueName: \"kubernetes.io/projected/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-kube-api-access-s4r6g\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: W1201 21:55:06.300848 4962 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/38501711-21d8-43f0-8657-7507944ef792/volumes/kubernetes.io~configmap/dns-swift-storage-0 Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.300881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38501711-21d8-43f0-8657-7507944ef792" (UID: "38501711-21d8-43f0-8657-7507944ef792"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.301234 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-credential-keys\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.301287 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-internal-tls-certs\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.301334 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-fernet-keys\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.301436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-combined-ca-bundle\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.301544 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-public-tls-certs\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.301581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-config-data\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.301812 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38501711-21d8-43f0-8657-7507944ef792" (UID: "38501711-21d8-43f0-8657-7507944ef792"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.304153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38501711-21d8-43f0-8657-7507944ef792" (UID: "38501711-21d8-43f0-8657-7507944ef792"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.307641 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-public-tls-certs\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.307719 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.308036 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dprzw\" (UniqueName: \"kubernetes.io/projected/38501711-21d8-43f0-8657-7507944ef792-kube-api-access-dprzw\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.308087 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.309158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-credential-keys\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.310414 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-internal-tls-certs\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.311043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-combined-ca-bundle\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.312256 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-config-data\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.314591 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-fernet-keys\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.322873 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-config" (OuterVolumeSpecName: "config") pod "38501711-21d8-43f0-8657-7507944ef792" (UID: "38501711-21d8-43f0-8657-7507944ef792"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.324476 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-scripts\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.325691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r6g\" (UniqueName: \"kubernetes.io/projected/e30f9e2d-e0be-4484-bf6b-83c39beaa7e6-kube-api-access-s4r6g\") pod \"keystone-bfc984cd5-wc42c\" (UID: \"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6\") " pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.410135 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.410160 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.410173 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38501711-21d8-43f0-8657-7507944ef792-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.435495 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.955134 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" event={"ID":"38501711-21d8-43f0-8657-7507944ef792","Type":"ContainerDied","Data":"5d9f9179d1745ffbf39c9179ad3c2e0c622a0418a15dec8fc5f2b4e27af15075"} Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.955662 4962 scope.go:117] "RemoveContainer" containerID="b40b507046667320549cf3401127527920805e6dcd9c4165a87336393f712877" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.955685 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.962583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x55sw" event={"ID":"873a333f-1f2b-4824-86a7-7935ff6908f9","Type":"ContainerStarted","Data":"7a0e052c4595839848ef3ee1213e2439112bd2167ad32ddfb8ab42a9161cb824"} Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.971051 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.971080 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 21:55:06 crc kubenswrapper[4962]: I1201 21:55:06.971108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q7knr" event={"ID":"98cede6f-200e-44d9-a4ee-886de53f2459","Type":"ContainerStarted","Data":"03f499c20053f72585e31fd54b1d5759491a0ce63ac00973d6192f9e5c5683d4"} Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.004258 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-x55sw" podStartSLOduration=3.759610045 podStartE2EDuration="43.004239757s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="2025-12-01 21:54:26.670560186 +0000 UTC m=+1250.771999381" lastFinishedPulling="2025-12-01 21:55:05.915189908 +0000 UTC m=+1290.016629093" observedRunningTime="2025-12-01 21:55:06.981158129 +0000 UTC m=+1291.082597334" watchObservedRunningTime="2025-12-01 21:55:07.004239757 +0000 UTC m=+1291.105678962" Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.007104 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-q7knr" podStartSLOduration=3.657269029 podStartE2EDuration="43.007096368s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="2025-12-01 21:54:26.82759066 +0000 UTC m=+1250.929029855" lastFinishedPulling="2025-12-01 21:55:06.177417999 +0000 UTC m=+1290.278857194" observedRunningTime="2025-12-01 21:55:06.999853802 +0000 UTC m=+1291.101292997" watchObservedRunningTime="2025-12-01 21:55:07.007096368 +0000 UTC m=+1291.108535563" Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.042086 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mfzbs"] Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.060176 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mfzbs"] Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.068097 4962 scope.go:117] "RemoveContainer" containerID="d6f19e7483f9073cbf641cc7a0feb21630f8bbdbaaa2b28af533b631218286cd" Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.074478 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bfc984cd5-wc42c"] Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.976982 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.977435 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.994677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bfc984cd5-wc42c" event={"ID":"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6","Type":"ContainerStarted","Data":"43ae0d24f663409f2c13ea92878dbcf3e0c946ed7fc80a074d1309d7f1de6e3b"} Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 
Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.994752 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-bfc984cd5-wc42c"
Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.994764 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bfc984cd5-wc42c" event={"ID":"e30f9e2d-e0be-4484-bf6b-83c39beaa7e6","Type":"ContainerStarted","Data":"469cd516bf5e8f7f752e22ddbcdb194e1e0bed0b13d419d8f351973ea6eaacad"}
Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.997552 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" containerID="45946b1574666843499789ecb531de96bd76a29871dbdec68756db81bc11fbc6" exitCode=0
Dec 01 21:55:07 crc kubenswrapper[4962]: I1201 21:55:07.997633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l44sw" event={"ID":"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c","Type":"ContainerDied","Data":"45946b1574666843499789ecb531de96bd76a29871dbdec68756db81bc11fbc6"}
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.009774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vzzq8" event={"ID":"29a10360-5fb7-4add-8bf5-1bc35e6e76dd","Type":"ContainerStarted","Data":"4ee394e3c08b6b22f910c1ac746acc212144457ea20d4507fd9f0b626d62f0ea"}
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.033865 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bfc984cd5-wc42c" podStartSLOduration=3.033846332 podStartE2EDuration="3.033846332s" podCreationTimestamp="2025-12-01 21:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:08.024564338 +0000 UTC m=+1292.126003533" watchObservedRunningTime="2025-12-01 21:55:08.033846332 +0000 UTC m=+1292.135285527"
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.066792 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vzzq8" podStartSLOduration=4.173881588 podStartE2EDuration="44.066775411s" podCreationTimestamp="2025-12-01 21:54:24 +0000 UTC" firstStartedPulling="2025-12-01 21:54:26.835844945 +0000 UTC m=+1250.937284140" lastFinishedPulling="2025-12-01 21:55:06.728738768 +0000 UTC m=+1290.830177963" observedRunningTime="2025-12-01 21:55:08.060651096 +0000 UTC m=+1292.162090301" watchObservedRunningTime="2025-12-01 21:55:08.066775411 +0000 UTC m=+1292.168214606"
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.236766 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38501711-21d8-43f0-8657-7507944ef792" path="/var/lib/kubelet/pods/38501711-21d8-43f0-8657-7507944ef792/volumes"
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.239081 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.474458 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.474562 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 21:55:08 crc kubenswrapper[4962]: I1201 21:55:08.560154 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 01 21:55:10 crc kubenswrapper[4962]: I1201 21:55:10.582663 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-mfzbs" podUID="38501711-21d8-43f0-8657-7507944ef792" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.182:5353: i/o timeout"
Dec 01 21:55:10 crc kubenswrapper[4962]: I1201 21:55:10.945987 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-l44sw"
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.045421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-scripts\") pod \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") "
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.045512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-logs\") pod \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") "
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.045545 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m7gn\" (UniqueName: \"kubernetes.io/projected/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-kube-api-access-8m7gn\") pod \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") "
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.045659 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-combined-ca-bundle\") pod \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") "
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.045712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-config-data\") pod \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\" (UID: \"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c\") "
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.046174 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-logs" (OuterVolumeSpecName: "logs") pod "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" (UID: "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.046347 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-logs\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.047377 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-l44sw" event={"ID":"eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c","Type":"ContainerDied","Data":"12ad90e3be792bb4ce4c07251a62f3070e17815d0c1d05b7718123681646953e"}
Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.047411 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ad90e3be792bb4ce4c07251a62f3070e17815d0c1d05b7718123681646953e"
Need to start a new one" pod="openstack/placement-db-sync-l44sw" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.061195 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-kube-api-access-8m7gn" (OuterVolumeSpecName: "kube-api-access-8m7gn") pod "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" (UID: "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c"). InnerVolumeSpecName "kube-api-access-8m7gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.073117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-scripts" (OuterVolumeSpecName: "scripts") pod "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" (UID: "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.085648 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" (UID: "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.098634 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-config-data" (OuterVolumeSpecName: "config-data") pod "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" (UID: "eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.149065 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.149119 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m7gn\" (UniqueName: \"kubernetes.io/projected/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-kube-api-access-8m7gn\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.149136 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:11 crc kubenswrapper[4962]: I1201 21:55:11.149149 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.041191 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54647f544-d6jzt"] Dec 01 21:55:12 crc kubenswrapper[4962]: E1201 21:55:12.041634 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38501711-21d8-43f0-8657-7507944ef792" containerName="init" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.041649 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="38501711-21d8-43f0-8657-7507944ef792" containerName="init" Dec 01 21:55:12 crc kubenswrapper[4962]: E1201 21:55:12.041691 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38501711-21d8-43f0-8657-7507944ef792" containerName="dnsmasq-dns" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.041697 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="38501711-21d8-43f0-8657-7507944ef792" containerName="dnsmasq-dns" Dec 01 21:55:12 crc kubenswrapper[4962]: E1201 21:55:12.041711 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" containerName="placement-db-sync" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.041717 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" containerName="placement-db-sync" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.041920 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="38501711-21d8-43f0-8657-7507944ef792" containerName="dnsmasq-dns" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.041954 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" containerName="placement-db-sync" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.043655 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.049251 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.049452 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.049634 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zj84r" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.049781 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.050610 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.061027 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54647f544-d6jzt"] Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.192247 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-internal-tls-certs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.192543 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-scripts\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.192604 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-combined-ca-bundle\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.192651 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6107fcdb-ceea-4953-a667-e3a973c68de3-logs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.192736 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-config-data\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.192803 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-public-tls-certs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.192836 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfn56\" (UniqueName: \"kubernetes.io/projected/6107fcdb-ceea-4953-a667-e3a973c68de3-kube-api-access-wfn56\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.294370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-combined-ca-bundle\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.294435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6107fcdb-ceea-4953-a667-e3a973c68de3-logs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.294506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-config-data\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.294554 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-public-tls-certs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.294574 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfn56\" (UniqueName: \"kubernetes.io/projected/6107fcdb-ceea-4953-a667-e3a973c68de3-kube-api-access-wfn56\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.294652 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-internal-tls-certs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.294678 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-scripts\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.295403 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6107fcdb-ceea-4953-a667-e3a973c68de3-logs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.300524 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-public-tls-certs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.301386 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-scripts\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.301480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-config-data\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.301669 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-internal-tls-certs\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.316688 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfn56\" (UniqueName: \"kubernetes.io/projected/6107fcdb-ceea-4953-a667-e3a973c68de3-kube-api-access-wfn56\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.320320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6107fcdb-ceea-4953-a667-e3a973c68de3-combined-ca-bundle\") pod \"placement-54647f544-d6jzt\" (UID: \"6107fcdb-ceea-4953-a667-e3a973c68de3\") " pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:12 crc kubenswrapper[4962]: I1201 21:55:12.363814 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:13 crc kubenswrapper[4962]: I1201 21:55:13.084185 4962 generic.go:334] "Generic (PLEG): container finished" podID="98cede6f-200e-44d9-a4ee-886de53f2459" containerID="03f499c20053f72585e31fd54b1d5759491a0ce63ac00973d6192f9e5c5683d4" exitCode=0 Dec 01 21:55:13 crc kubenswrapper[4962]: I1201 21:55:13.084446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q7knr" event={"ID":"98cede6f-200e-44d9-a4ee-886de53f2459","Type":"ContainerDied","Data":"03f499c20053f72585e31fd54b1d5759491a0ce63ac00973d6192f9e5c5683d4"} Dec 01 21:55:13 crc kubenswrapper[4962]: I1201 21:55:13.574429 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54647f544-d6jzt"] Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.096383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerStarted","Data":"3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559"} Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.100569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54647f544-d6jzt" event={"ID":"6107fcdb-ceea-4953-a667-e3a973c68de3","Type":"ContainerStarted","Data":"d9e2789fb239a2212f77db5e5c36826d3bee22c8da50da07c84a1ae68d73f954"} Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.100750 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.100997 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.101103 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54647f544-d6jzt" event={"ID":"6107fcdb-ceea-4953-a667-e3a973c68de3","Type":"ContainerStarted","Data":"01092cf89e520e178cd2a50ccfbbb6e788b4f2a89081494f45ffc5635d4f716b"} Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.101198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54647f544-d6jzt" event={"ID":"6107fcdb-ceea-4953-a667-e3a973c68de3","Type":"ContainerStarted","Data":"426a54ca2c0ce1346f6b90d0b62c28dabb38689fa9dc2791fa9b30af7ea884fd"} Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.102628 4962 generic.go:334] "Generic (PLEG): container finished" podID="873a333f-1f2b-4824-86a7-7935ff6908f9" containerID="7a0e052c4595839848ef3ee1213e2439112bd2167ad32ddfb8ab42a9161cb824" exitCode=0 Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.102705 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x55sw" event={"ID":"873a333f-1f2b-4824-86a7-7935ff6908f9","Type":"ContainerDied","Data":"7a0e052c4595839848ef3ee1213e2439112bd2167ad32ddfb8ab42a9161cb824"} Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.126467 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54647f544-d6jzt" podStartSLOduration=2.12644868 podStartE2EDuration="2.12644868s" podCreationTimestamp="2025-12-01 21:55:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:14.119457901 +0000 UTC m=+1298.220897116" watchObservedRunningTime="2025-12-01 21:55:14.12644868 +0000 UTC m=+1298.227887885" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 
21:55:14.607574 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q7knr" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.662324 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-combined-ca-bundle\") pod \"98cede6f-200e-44d9-a4ee-886de53f2459\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.662402 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-db-sync-config-data\") pod \"98cede6f-200e-44d9-a4ee-886de53f2459\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.662598 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbmn\" (UniqueName: \"kubernetes.io/projected/98cede6f-200e-44d9-a4ee-886de53f2459-kube-api-access-xfbmn\") pod \"98cede6f-200e-44d9-a4ee-886de53f2459\" (UID: \"98cede6f-200e-44d9-a4ee-886de53f2459\") " Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.678306 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "98cede6f-200e-44d9-a4ee-886de53f2459" (UID: "98cede6f-200e-44d9-a4ee-886de53f2459"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.682161 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cede6f-200e-44d9-a4ee-886de53f2459-kube-api-access-xfbmn" (OuterVolumeSpecName: "kube-api-access-xfbmn") pod "98cede6f-200e-44d9-a4ee-886de53f2459" (UID: "98cede6f-200e-44d9-a4ee-886de53f2459"). InnerVolumeSpecName "kube-api-access-xfbmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.718376 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98cede6f-200e-44d9-a4ee-886de53f2459" (UID: "98cede6f-200e-44d9-a4ee-886de53f2459"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.765157 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.765354 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98cede6f-200e-44d9-a4ee-886de53f2459-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:14 crc kubenswrapper[4962]: I1201 21:55:14.765424 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbmn\" (UniqueName: \"kubernetes.io/projected/98cede6f-200e-44d9-a4ee-886de53f2459-kube-api-access-xfbmn\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:14 crc kubenswrapper[4962]: E1201 21:55:14.924714 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29a10360_5fb7_4add_8bf5_1bc35e6e76dd.slice/crio-conmon-4ee394e3c08b6b22f910c1ac746acc212144457ea20d4507fd9f0b626d62f0ea.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29a10360_5fb7_4add_8bf5_1bc35e6e76dd.slice/crio-4ee394e3c08b6b22f910c1ac746acc212144457ea20d4507fd9f0b626d62f0ea.scope\": RecentStats: unable to find data in memory cache]" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.126798 4962 generic.go:334] "Generic (PLEG): container finished" podID="29a10360-5fb7-4add-8bf5-1bc35e6e76dd" containerID="4ee394e3c08b6b22f910c1ac746acc212144457ea20d4507fd9f0b626d62f0ea" exitCode=0 Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.127009 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vzzq8" event={"ID":"29a10360-5fb7-4add-8bf5-1bc35e6e76dd","Type":"ContainerDied","Data":"4ee394e3c08b6b22f910c1ac746acc212144457ea20d4507fd9f0b626d62f0ea"} Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.134258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q7knr" event={"ID":"98cede6f-200e-44d9-a4ee-886de53f2459","Type":"ContainerDied","Data":"d47b5bc6cb9c8a4fab77a102e3a42937b83029a7e3bb86fb46aec16c0e7760f8"} Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.134337 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47b5bc6cb9c8a4fab77a102e3a42937b83029a7e3bb86fb46aec16c0e7760f8" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.134288 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q7knr" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.420405 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76b577fdff-d85rv"] Dec 01 21:55:15 crc kubenswrapper[4962]: E1201 21:55:15.432479 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cede6f-200e-44d9-a4ee-886de53f2459" containerName="barbican-db-sync" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.432499 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cede6f-200e-44d9-a4ee-886de53f2459" containerName="barbican-db-sync" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.432731 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cede6f-200e-44d9-a4ee-886de53f2459" containerName="barbican-db-sync" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.433865 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.449455 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.449774 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6rtb4" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.449925 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.457003 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb"] Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.458871 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.473067 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.487851 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76b577fdff-d85rv"] Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.524591 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb"] Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.585943 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-config-data-custom\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586018 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-combined-ca-bundle\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fcefe11-14bc-40f6-8552-da42f7b63977-logs\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586105 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-combined-ca-bundle\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586123 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-config-data\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-config-data\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flgwb\" (UniqueName: \"kubernetes.io/projected/909a52c1-349c-4b1b-929f-7d2c554cad32-kube-api-access-flgwb\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " 
pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586246 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk8cw\" (UniqueName: \"kubernetes.io/projected/0fcefe11-14bc-40f6-8552-da42f7b63977-kube-api-access-kk8cw\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909a52c1-349c-4b1b-929f-7d2c554cad32-logs\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.586279 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-config-data-custom\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flgwb\" (UniqueName: \"kubernetes.io/projected/909a52c1-349c-4b1b-929f-7d2c554cad32-kube-api-access-flgwb\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692412 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk8cw\" (UniqueName: \"kubernetes.io/projected/0fcefe11-14bc-40f6-8552-da42f7b63977-kube-api-access-kk8cw\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909a52c1-349c-4b1b-929f-7d2c554cad32-logs\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-config-data-custom\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692478 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-config-data-custom\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692533 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-combined-ca-bundle\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692604 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fcefe11-14bc-40f6-8552-da42f7b63977-logs\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692624 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-combined-ca-bundle\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692640 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-config-data\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.692685 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-config-data\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.702552 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jdj42"] Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.703731 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909a52c1-349c-4b1b-929f-7d2c554cad32-logs\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.704009 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fcefe11-14bc-40f6-8552-da42f7b63977-logs\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.704468 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.725174 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-config-data\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.728383 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jdj42"] Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.737020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-config-data\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.737681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-combined-ca-bundle\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.739547 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fcefe11-14bc-40f6-8552-da42f7b63977-config-data-custom\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.740452 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-config-data-custom\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.746323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk8cw\" (UniqueName: \"kubernetes.io/projected/0fcefe11-14bc-40f6-8552-da42f7b63977-kube-api-access-kk8cw\") pod \"barbican-worker-76b577fdff-d85rv\" (UID: \"0fcefe11-14bc-40f6-8552-da42f7b63977\") " pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.756254 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909a52c1-349c-4b1b-929f-7d2c554cad32-combined-ca-bundle\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.807507 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flgwb\" (UniqueName: \"kubernetes.io/projected/909a52c1-349c-4b1b-929f-7d2c554cad32-kube-api-access-flgwb\") pod \"barbican-keystone-listener-7fbb7f6fcd-9l2gb\" (UID: \"909a52c1-349c-4b1b-929f-7d2c554cad32\") " pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.807613 4962 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76b577fdff-d85rv" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.874251 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.898842 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.898920 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-config\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.898955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.898975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhws\" (UniqueName: \"kubernetes.io/projected/b92b9b58-0f0f-48ed-924a-005317716a78-kube-api-access-nmhws\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.899006 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.899111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.929076 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55db4bf6b-mhhwj"] Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.930774 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.941957 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 21:55:15 crc kubenswrapper[4962]: I1201 21:55:15.981075 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55db4bf6b-mhhwj"] Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.001541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.001623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-config\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.001645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.001668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhws\" (UniqueName: \"kubernetes.io/projected/b92b9b58-0f0f-48ed-924a-005317716a78-kube-api-access-nmhws\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.001698 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.001804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.002548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.002577 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 
21:55:16.003107 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.003372 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.003616 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-config\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.027634 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhws\" (UniqueName: \"kubernetes.io/projected/b92b9b58-0f0f-48ed-924a-005317716a78-kube-api-access-nmhws\") pod \"dnsmasq-dns-848cf88cfc-jdj42\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.103719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data-custom\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.103819 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5ht\" (UniqueName: \"kubernetes.io/projected/79441f7a-11fa-4ace-8deb-29d7db95e67c-kube-api-access-pg5ht\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.103855 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.103891 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79441f7a-11fa-4ace-8deb-29d7db95e67c-logs\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.103968 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-combined-ca-bundle\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 
21:55:16.191859 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.212435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-combined-ca-bundle\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.212608 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data-custom\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.212773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5ht\" (UniqueName: \"kubernetes.io/projected/79441f7a-11fa-4ace-8deb-29d7db95e67c-kube-api-access-pg5ht\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.212836 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.212906 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79441f7a-11fa-4ace-8deb-29d7db95e67c-logs\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.213992 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79441f7a-11fa-4ace-8deb-29d7db95e67c-logs\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.219297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-combined-ca-bundle\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.219908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.238890 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data-custom\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " 
pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.264123 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-x55sw" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.278961 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5ht\" (UniqueName: \"kubernetes.io/projected/79441f7a-11fa-4ace-8deb-29d7db95e67c-kube-api-access-pg5ht\") pod \"barbican-api-55db4bf6b-mhhwj\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.361869 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.418567 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-config-data\") pod \"873a333f-1f2b-4824-86a7-7935ff6908f9\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.418954 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbpm\" (UniqueName: \"kubernetes.io/projected/873a333f-1f2b-4824-86a7-7935ff6908f9-kube-api-access-dtbpm\") pod \"873a333f-1f2b-4824-86a7-7935ff6908f9\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.419093 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-combined-ca-bundle\") pod \"873a333f-1f2b-4824-86a7-7935ff6908f9\" (UID: \"873a333f-1f2b-4824-86a7-7935ff6908f9\") " Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.429971 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873a333f-1f2b-4824-86a7-7935ff6908f9-kube-api-access-dtbpm" (OuterVolumeSpecName: "kube-api-access-dtbpm") pod "873a333f-1f2b-4824-86a7-7935ff6908f9" (UID: "873a333f-1f2b-4824-86a7-7935ff6908f9"). InnerVolumeSpecName "kube-api-access-dtbpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.489868 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "873a333f-1f2b-4824-86a7-7935ff6908f9" (UID: "873a333f-1f2b-4824-86a7-7935ff6908f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.521828 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbpm\" (UniqueName: \"kubernetes.io/projected/873a333f-1f2b-4824-86a7-7935ff6908f9-kube-api-access-dtbpm\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.521855 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.568437 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-config-data" (OuterVolumeSpecName: "config-data") pod "873a333f-1f2b-4824-86a7-7935ff6908f9" (UID: "873a333f-1f2b-4824-86a7-7935ff6908f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.628601 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873a333f-1f2b-4824-86a7-7935ff6908f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.655012 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb"] Dec 01 21:55:16 crc kubenswrapper[4962]: W1201 21:55:16.659387 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod909a52c1_349c_4b1b_929f_7d2c554cad32.slice/crio-1cc0924731f503e53e345303c6181bf7bc35b44c8e7ee48b6826f850dd914af8 WatchSource:0}: Error finding container 1cc0924731f503e53e345303c6181bf7bc35b44c8e7ee48b6826f850dd914af8: Status 404 returned error can't find the container with id 1cc0924731f503e53e345303c6181bf7bc35b44c8e7ee48b6826f850dd914af8 Dec 01 21:55:16 crc kubenswrapper[4962]: W1201 21:55:16.905277 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fcefe11_14bc_40f6_8552_da42f7b63977.slice/crio-fe5dd8be40442f63822ef387493161ede9ca332883a45e8a12014cae6978474b WatchSource:0}: Error finding container fe5dd8be40442f63822ef387493161ede9ca332883a45e8a12014cae6978474b: Status 404 returned error can't find the container with id fe5dd8be40442f63822ef387493161ede9ca332883a45e8a12014cae6978474b Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.906477 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76b577fdff-d85rv"] Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.906511 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:55:16 crc kubenswrapper[4962]: I1201 21:55:16.983586 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jdj42"] Dec 01 21:55:16 crc kubenswrapper[4962]: W1201 21:55:16.993106 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92b9b58_0f0f_48ed_924a_005317716a78.slice/crio-5606b20fc23188c1f18e65b41caa0639f15720c6b9c14c5f20caff6161f93893 WatchSource:0}: Error finding container 5606b20fc23188c1f18e65b41caa0639f15720c6b9c14c5f20caff6161f93893: Status 404 returned error can't find the container with id 5606b20fc23188c1f18e65b41caa0639f15720c6b9c14c5f20caff6161f93893 Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.036815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-etc-machine-id\") pod \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.036853 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv9w6\" (UniqueName: \"kubernetes.io/projected/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-kube-api-access-dv9w6\") pod \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.036928 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-config-data\") pod \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.036923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "29a10360-5fb7-4add-8bf5-1bc35e6e76dd" (UID: "29a10360-5fb7-4add-8bf5-1bc35e6e76dd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.037129 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-db-sync-config-data\") pod \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.037274 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-scripts\") pod \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.037295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-combined-ca-bundle\") pod \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\" (UID: \"29a10360-5fb7-4add-8bf5-1bc35e6e76dd\") " Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.037950 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.054446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29a10360-5fb7-4add-8bf5-1bc35e6e76dd" (UID: "29a10360-5fb7-4add-8bf5-1bc35e6e76dd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.058419 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-scripts" (OuterVolumeSpecName: "scripts") pod "29a10360-5fb7-4add-8bf5-1bc35e6e76dd" (UID: "29a10360-5fb7-4add-8bf5-1bc35e6e76dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.059496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-kube-api-access-dv9w6" (OuterVolumeSpecName: "kube-api-access-dv9w6") pod "29a10360-5fb7-4add-8bf5-1bc35e6e76dd" (UID: "29a10360-5fb7-4add-8bf5-1bc35e6e76dd"). InnerVolumeSpecName "kube-api-access-dv9w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.101569 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29a10360-5fb7-4add-8bf5-1bc35e6e76dd" (UID: "29a10360-5fb7-4add-8bf5-1bc35e6e76dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.128309 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-config-data" (OuterVolumeSpecName: "config-data") pod "29a10360-5fb7-4add-8bf5-1bc35e6e76dd" (UID: "29a10360-5fb7-4add-8bf5-1bc35e6e76dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.139403 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.139437 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.139447 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.139456 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv9w6\" (UniqueName: \"kubernetes.io/projected/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-kube-api-access-dv9w6\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.139468 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a10360-5fb7-4add-8bf5-1bc35e6e76dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.197355 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vzzq8" event={"ID":"29a10360-5fb7-4add-8bf5-1bc35e6e76dd","Type":"ContainerDied","Data":"4d39c5143aa52d6e193b656e023b02164398969e26dc8f00e0e07da4013cde1a"} Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.197397 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d39c5143aa52d6e193b656e023b02164398969e26dc8f00e0e07da4013cde1a" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.197458 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vzzq8" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.211627 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55db4bf6b-mhhwj"] Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.215632 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-x55sw" event={"ID":"873a333f-1f2b-4824-86a7-7935ff6908f9","Type":"ContainerDied","Data":"b9155aa85aa567602788e132d7acb942d8ca696a1545b52ae459425e7dff0d50"} Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.215668 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9155aa85aa567602788e132d7acb942d8ca696a1545b52ae459425e7dff0d50" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.215695 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-x55sw" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.232040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" event={"ID":"909a52c1-349c-4b1b-929f-7d2c554cad32","Type":"ContainerStarted","Data":"1cc0924731f503e53e345303c6181bf7bc35b44c8e7ee48b6826f850dd914af8"} Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.236384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" event={"ID":"b92b9b58-0f0f-48ed-924a-005317716a78","Type":"ContainerStarted","Data":"5606b20fc23188c1f18e65b41caa0639f15720c6b9c14c5f20caff6161f93893"} Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.240787 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b577fdff-d85rv" event={"ID":"0fcefe11-14bc-40f6-8552-da42f7b63977","Type":"ContainerStarted","Data":"fe5dd8be40442f63822ef387493161ede9ca332883a45e8a12014cae6978474b"} Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.462141 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:17 crc kubenswrapper[4962]: E1201 21:55:17.462884 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873a333f-1f2b-4824-86a7-7935ff6908f9" containerName="heat-db-sync" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.462903 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="873a333f-1f2b-4824-86a7-7935ff6908f9" containerName="heat-db-sync" Dec 01 21:55:17 crc kubenswrapper[4962]: E1201 21:55:17.462916 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a10360-5fb7-4add-8bf5-1bc35e6e76dd" containerName="cinder-db-sync" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.462922 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a10360-5fb7-4add-8bf5-1bc35e6e76dd" containerName="cinder-db-sync" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.463186 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="873a333f-1f2b-4824-86a7-7935ff6908f9" containerName="heat-db-sync" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.463226 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a10360-5fb7-4add-8bf5-1bc35e6e76dd" containerName="cinder-db-sync" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.464362 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.467653 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.467683 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n49jq" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.469658 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.475544 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.482826 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.543368 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jdj42"] Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.547418 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.547453 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptcv4\" (UniqueName: \"kubernetes.io/projected/6cf6cbec-dfb2-4afc-a180-e38559810d02-kube-api-access-ptcv4\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.547496 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.547549 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cf6cbec-dfb2-4afc-a180-e38559810d02-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.547610 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.547666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.603054 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-42wgd"] Dec 01 21:55:17 crc 
kubenswrapper[4962]: I1201 21:55:17.605034 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.632046 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-42wgd"] Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.649478 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cf6cbec-dfb2-4afc-a180-e38559810d02-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.649946 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-config\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650115 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650224 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cf6cbec-dfb2-4afc-a180-e38559810d02-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4thwd\" (UniqueName: \"kubernetes.io/projected/5f1e9610-2753-4001-a9fb-5e020774725b-kube-api-access-4thwd\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650382 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650466 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650582 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650684 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptcv4\" (UniqueName: \"kubernetes.io/projected/6cf6cbec-dfb2-4afc-a180-e38559810d02-kube-api-access-ptcv4\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650839 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-svc\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.650915 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.662601 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.670699 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.671412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.677800 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcv4\" (UniqueName: \"kubernetes.io/projected/6cf6cbec-dfb2-4afc-a180-e38559810d02-kube-api-access-ptcv4\") pod \"cinder-scheduler-0\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.685510 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"6cf6cbec-dfb2-4afc-a180-e38559810d02\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.729618 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.737139 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.743718 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.750586 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.752914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-config\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.752979 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.753026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4thwd\" (UniqueName: \"kubernetes.io/projected/5f1e9610-2753-4001-a9fb-5e020774725b-kube-api-access-4thwd\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.753062 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.753110 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.753157 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-svc\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.754046 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-config\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.754115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-svc\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.754159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.754858 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.755468 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.809436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4thwd\" (UniqueName: \"kubernetes.io/projected/5f1e9610-2753-4001-a9fb-5e020774725b-kube-api-access-4thwd\") pod \"dnsmasq-dns-6578955fd5-42wgd\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.821991 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.856383 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data-custom\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.856780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95331b15-926c-429c-a7e2-41bc904f6629-etc-machine-id\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.856845 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffk2s\" (UniqueName: \"kubernetes.io/projected/95331b15-926c-429c-a7e2-41bc904f6629-kube-api-access-ffk2s\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.857524 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.857592 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95331b15-926c-429c-a7e2-41bc904f6629-logs\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.857623 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.857655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-scripts\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.940871 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.962226 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95331b15-926c-429c-a7e2-41bc904f6629-etc-machine-id\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.962279 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffk2s\" (UniqueName: \"kubernetes.io/projected/95331b15-926c-429c-a7e2-41bc904f6629-kube-api-access-ffk2s\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.962365 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.962399 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95331b15-926c-429c-a7e2-41bc904f6629-logs\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.962419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.962441 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-scripts\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.962493 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data-custom\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.964920 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95331b15-926c-429c-a7e2-41bc904f6629-logs\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.965013 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95331b15-926c-429c-a7e2-41bc904f6629-etc-machine-id\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.967101 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " 
pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.969549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-scripts\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.970633 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data-custom\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.976459 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:17 crc kubenswrapper[4962]: I1201 21:55:17.987472 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffk2s\" (UniqueName: \"kubernetes.io/projected/95331b15-926c-429c-a7e2-41bc904f6629-kube-api-access-ffk2s\") pod \"cinder-api-0\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " pod="openstack/cinder-api-0" Dec 01 21:55:18 crc kubenswrapper[4962]: I1201 21:55:18.179479 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 21:55:18 crc kubenswrapper[4962]: I1201 21:55:18.438100 4962 generic.go:334] "Generic (PLEG): container finished" podID="b92b9b58-0f0f-48ed-924a-005317716a78" containerID="f30cd32a16ae30fb07bc255f27a885076d3e7f6d6b645546881a4289c0a926e9" exitCode=0 Dec 01 21:55:18 crc kubenswrapper[4962]: I1201 21:55:18.444490 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" event={"ID":"b92b9b58-0f0f-48ed-924a-005317716a78","Type":"ContainerDied","Data":"f30cd32a16ae30fb07bc255f27a885076d3e7f6d6b645546881a4289c0a926e9"} Dec 01 21:55:18 crc kubenswrapper[4962]: I1201 21:55:18.448898 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55db4bf6b-mhhwj" event={"ID":"79441f7a-11fa-4ace-8deb-29d7db95e67c","Type":"ContainerStarted","Data":"56188a8c3bc08d44dc9fad529e37f59ccaac6cc4e0beafc1c2ef0dfe6979ae6a"} Dec 01 21:55:18 crc kubenswrapper[4962]: I1201 21:55:18.448958 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55db4bf6b-mhhwj" event={"ID":"79441f7a-11fa-4ace-8deb-29d7db95e67c","Type":"ContainerStarted","Data":"0b7eb0a1abd08acc09f071e88b9c324d0ef37fa750a1c136691a99ae37e0402c"} Dec 01 21:55:18 crc kubenswrapper[4962]: I1201 21:55:18.522597 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:18 crc kubenswrapper[4962]: I1201 21:55:18.846521 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-42wgd"] Dec 01 21:55:18 crc kubenswrapper[4962]: E1201 21:55:18.883475 4962 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 01 21:55:18 crc kubenswrapper[4962]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b92b9b58-0f0f-48ed-924a-005317716a78/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 21:55:18 crc 
kubenswrapper[4962]: > podSandboxID="5606b20fc23188c1f18e65b41caa0639f15720c6b9c14c5f20caff6161f93893" Dec 01 21:55:18 crc kubenswrapper[4962]: E1201 21:55:18.883654 4962 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 01 21:55:18 crc kubenswrapper[4962]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n99h8bhd9h696h649h588h5c6h658h5b4h57fh65h89h5f5h56h696h5dh8h57h597h68ch568h58dh66hf4h675h598h588h67dhb5h69h5dh6bq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmhws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-848cf88cfc-jdj42_openstack(b92b9b58-0f0f-48ed-924a-005317716a78): CreateContainerError: container create failed: mount 
`/var/lib/kubelet/pods/b92b9b58-0f0f-48ed-924a-005317716a78/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 21:55:18 crc kubenswrapper[4962]: > logger="UnhandledError" Dec 01 21:55:18 crc kubenswrapper[4962]: E1201 21:55:18.884832 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b92b9b58-0f0f-48ed-924a-005317716a78/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" podUID="b92b9b58-0f0f-48ed-924a-005317716a78" Dec 01 21:55:18 crc kubenswrapper[4962]: W1201 21:55:18.915091 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1e9610_2753_4001_a9fb_5e020774725b.slice/crio-1f92e3fa8a465f375e2752c223d3eb3067817fabd451aa5612d522f9389ec307 WatchSource:0}: Error finding container 1f92e3fa8a465f375e2752c223d3eb3067817fabd451aa5612d522f9389ec307: Status 404 returned error can't find the container with id 1f92e3fa8a465f375e2752c223d3eb3067817fabd451aa5612d522f9389ec307 Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.046548 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.477042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cf6cbec-dfb2-4afc-a180-e38559810d02","Type":"ContainerStarted","Data":"6e0f0f845d71a8e5bad98dff9a711f23aaad820e9ed661556b4224357fdd5f5b"} Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.479082 4962 generic.go:334] "Generic (PLEG): container finished" podID="5f1e9610-2753-4001-a9fb-5e020774725b" containerID="40bde0aa50d937698515a8077f29b99419a0c41d9992444b34270646a3a67cdf" exitCode=0 Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.479151 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" event={"ID":"5f1e9610-2753-4001-a9fb-5e020774725b","Type":"ContainerDied","Data":"40bde0aa50d937698515a8077f29b99419a0c41d9992444b34270646a3a67cdf"} Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.479179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" event={"ID":"5f1e9610-2753-4001-a9fb-5e020774725b","Type":"ContainerStarted","Data":"1f92e3fa8a465f375e2752c223d3eb3067817fabd451aa5612d522f9389ec307"} Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.485003 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55db4bf6b-mhhwj" event={"ID":"79441f7a-11fa-4ace-8deb-29d7db95e67c","Type":"ContainerStarted","Data":"02bbd4368f9bd123ad569cc0d9dc963126eb75ec8073c5bd5ce7b17f8dcca4eb"} Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.485278 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.485393 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.489683 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"95331b15-926c-429c-a7e2-41bc904f6629","Type":"ContainerStarted","Data":"fc9f0f3156bd9b5df0de24db8eaa51cb02c196090b4a7aa45e47b26ecbef71a7"} Dec 01 21:55:19 crc kubenswrapper[4962]: I1201 21:55:19.533912 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55db4bf6b-mhhwj" podStartSLOduration=4.533888796 podStartE2EDuration="4.533888796s" podCreationTimestamp="2025-12-01 21:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:19.521615807 +0000 UTC m=+1303.623055002" watchObservedRunningTime="2025-12-01 21:55:19.533888796 +0000 UTC m=+1303.635327991" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.504409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" event={"ID":"b92b9b58-0f0f-48ed-924a-005317716a78","Type":"ContainerDied","Data":"5606b20fc23188c1f18e65b41caa0639f15720c6b9c14c5f20caff6161f93893"} Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.504882 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5606b20fc23188c1f18e65b41caa0639f15720c6b9c14c5f20caff6161f93893" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.507421 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95331b15-926c-429c-a7e2-41bc904f6629","Type":"ContainerStarted","Data":"93c9f9afcbd718d18e2c431ea6d16d9fe64efb9ed6b30f45ebaafdb9ec6ddbf7"} Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.543179 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.664787 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-swift-storage-0\") pod \"b92b9b58-0f0f-48ed-924a-005317716a78\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.664840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-sb\") pod \"b92b9b58-0f0f-48ed-924a-005317716a78\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.664926 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmhws\" (UniqueName: \"kubernetes.io/projected/b92b9b58-0f0f-48ed-924a-005317716a78-kube-api-access-nmhws\") pod \"b92b9b58-0f0f-48ed-924a-005317716a78\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.665127 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-svc\") pod \"b92b9b58-0f0f-48ed-924a-005317716a78\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.665481 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-config\") pod \"b92b9b58-0f0f-48ed-924a-005317716a78\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 
21:55:20.666060 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-nb\") pod \"b92b9b58-0f0f-48ed-924a-005317716a78\" (UID: \"b92b9b58-0f0f-48ed-924a-005317716a78\") " Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.678665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92b9b58-0f0f-48ed-924a-005317716a78-kube-api-access-nmhws" (OuterVolumeSpecName: "kube-api-access-nmhws") pod "b92b9b58-0f0f-48ed-924a-005317716a78" (UID: "b92b9b58-0f0f-48ed-924a-005317716a78"). InnerVolumeSpecName "kube-api-access-nmhws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.766953 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-config" (OuterVolumeSpecName: "config") pod "b92b9b58-0f0f-48ed-924a-005317716a78" (UID: "b92b9b58-0f0f-48ed-924a-005317716a78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.769722 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.769747 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmhws\" (UniqueName: \"kubernetes.io/projected/b92b9b58-0f0f-48ed-924a-005317716a78-kube-api-access-nmhws\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.775920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b92b9b58-0f0f-48ed-924a-005317716a78" (UID: "b92b9b58-0f0f-48ed-924a-005317716a78"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.783242 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b92b9b58-0f0f-48ed-924a-005317716a78" (UID: "b92b9b58-0f0f-48ed-924a-005317716a78"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.784927 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b92b9b58-0f0f-48ed-924a-005317716a78" (UID: "b92b9b58-0f0f-48ed-924a-005317716a78"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.797596 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b92b9b58-0f0f-48ed-924a-005317716a78" (UID: "b92b9b58-0f0f-48ed-924a-005317716a78"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.876736 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.876770 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.876782 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.876810 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b92b9b58-0f0f-48ed-924a-005317716a78-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:20 crc kubenswrapper[4962]: I1201 21:55:20.976317 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.528002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" event={"ID":"909a52c1-349c-4b1b-929f-7d2c554cad32","Type":"ContainerStarted","Data":"7f930a333d893e566036711b68e3ed5bef6cd169840c85c438c5ed40f8df5b05"} Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.528058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" event={"ID":"909a52c1-349c-4b1b-929f-7d2c554cad32","Type":"ContainerStarted","Data":"a6c156ed58f6d4829329a1b09b90e2dafed6d851310d99b49c780b0af6e2c541"} Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.534922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" event={"ID":"5f1e9610-2753-4001-a9fb-5e020774725b","Type":"ContainerStarted","Data":"77fa35faed3d880a1a9dee57b9ba09a7430aca2181f1a19e5dd261235253db5d"} Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.535053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.536883 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jdj42" Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.537395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b577fdff-d85rv" event={"ID":"0fcefe11-14bc-40f6-8552-da42f7b63977","Type":"ContainerStarted","Data":"f904c8abf42798a9b2f8b2ea325dcdda6e8e3ba5aec33e888fe3239e9489bbeb"} Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.537439 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76b577fdff-d85rv" event={"ID":"0fcefe11-14bc-40f6-8552-da42f7b63977","Type":"ContainerStarted","Data":"c8b87fb66e14c7d639f01df1e22e0ee7900aeab225677f4d87709966aff5a383"} Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.548115 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7fbb7f6fcd-9l2gb" podStartSLOduration=2.663116016 podStartE2EDuration="6.548099105s" podCreationTimestamp="2025-12-01 21:55:15 +0000 UTC" firstStartedPulling="2025-12-01 21:55:16.665477981 +0000 UTC m=+1300.766917166" lastFinishedPulling="2025-12-01 21:55:20.55046106 +0000 UTC m=+1304.651900255" observedRunningTime="2025-12-01 21:55:21.54756099 +0000 UTC m=+1305.649000185" watchObservedRunningTime="2025-12-01 21:55:21.548099105 +0000 UTC m=+1305.649538300" Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.593925 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76b577fdff-d85rv" podStartSLOduration=2.963278497 podStartE2EDuration="6.5939051s" podCreationTimestamp="2025-12-01 21:55:15 +0000 UTC" firstStartedPulling="2025-12-01 21:55:16.918062977 +0000 UTC m=+1301.019502172" lastFinishedPulling="2025-12-01 21:55:20.54868958 +0000 UTC m=+1304.650128775" observedRunningTime="2025-12-01 21:55:21.582608408 +0000 UTC m=+1305.684047613" watchObservedRunningTime="2025-12-01 21:55:21.5939051 +0000 UTC m=+1305.695344295" Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.622245 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" podStartSLOduration=4.622229777 podStartE2EDuration="4.622229777s" podCreationTimestamp="2025-12-01 21:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:21.61427744 +0000 UTC m=+1305.715716635" watchObservedRunningTime="2025-12-01 21:55:21.622229777 +0000 UTC m=+1305.723668962" Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.745044 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jdj42"] Dec 01 21:55:21 crc kubenswrapper[4962]: I1201 21:55:21.757009 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jdj42"] Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.235000 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92b9b58-0f0f-48ed-924a-005317716a78" path="/var/lib/kubelet/pods/b92b9b58-0f0f-48ed-924a-005317716a78/volumes" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.546564 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-65c9498c86-5xmqd"] Dec 01 21:55:22 crc kubenswrapper[4962]: E1201 21:55:22.548264 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92b9b58-0f0f-48ed-924a-005317716a78" containerName="init" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.548355 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b92b9b58-0f0f-48ed-924a-005317716a78" containerName="init" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.548689 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92b9b58-0f0f-48ed-924a-005317716a78" containerName="init" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.551045 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.558537 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.558602 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.560219 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65c9498c86-5xmqd"] Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.567416 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95331b15-926c-429c-a7e2-41bc904f6629","Type":"ContainerStarted","Data":"453db843f82dd5ea7c684209285d986641744cb608bcf21dbe8e2885d8198ff3"} Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.567577 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api-log" containerID="cri-o://93c9f9afcbd718d18e2c431ea6d16d9fe64efb9ed6b30f45ebaafdb9ec6ddbf7" gracePeriod=30 Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.567796 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api" containerID="cri-o://453db843f82dd5ea7c684209285d986641744cb608bcf21dbe8e2885d8198ff3" gracePeriod=30 Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.567846 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.601542 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cf6cbec-dfb2-4afc-a180-e38559810d02","Type":"ContainerStarted","Data":"4fe7f9b202cf3bd255544ba999c4aec029a692aca05094f3299c20d5cdb5170a"} Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.601631 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cf6cbec-dfb2-4afc-a180-e38559810d02","Type":"ContainerStarted","Data":"ef2e1e950c9ed0a5e1b348acb20f5897de5d684ef7b59c4d14331740bc98ca0a"} Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.625103 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.625081849 podStartE2EDuration="5.625081849s" podCreationTimestamp="2025-12-01 21:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:22.61038199 +0000 UTC m=+1306.711821185" watchObservedRunningTime="2025-12-01 21:55:22.625081849 +0000 UTC m=+1306.726521044" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.637020 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-config-data\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.637113 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d67b1b-578f-46c7-aec8-83785d2fe411-logs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.637765 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkmmj\" (UniqueName: \"kubernetes.io/projected/91d67b1b-578f-46c7-aec8-83785d2fe411-kube-api-access-dkmmj\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.637798 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-combined-ca-bundle\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.637821 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-public-tls-certs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.637855 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-config-data-custom\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.637905 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-internal-tls-certs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.652919 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.689660306 podStartE2EDuration="5.652902222s" podCreationTimestamp="2025-12-01 21:55:17 +0000 UTC" firstStartedPulling="2025-12-01 21:55:18.565401903 +0000 UTC m=+1302.666841098" lastFinishedPulling="2025-12-01 21:55:20.528643819 +0000 UTC m=+1304.630083014" observedRunningTime="2025-12-01 21:55:22.640844728 +0000 UTC m=+1306.742283933" watchObservedRunningTime="2025-12-01 21:55:22.652902222 +0000 UTC m=+1306.754341417" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.739845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkmmj\" (UniqueName: 
\"kubernetes.io/projected/91d67b1b-578f-46c7-aec8-83785d2fe411-kube-api-access-dkmmj\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.739887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-combined-ca-bundle\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.739910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-public-tls-certs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.739960 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-config-data-custom\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.739988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-internal-tls-certs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.740042 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-config-data\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.740078 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d67b1b-578f-46c7-aec8-83785d2fe411-logs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.740442 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d67b1b-578f-46c7-aec8-83785d2fe411-logs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.746890 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-config-data\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.747491 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-public-tls-certs\") pod 
\"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.750571 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-internal-tls-certs\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.751578 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-config-data-custom\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.757954 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkmmj\" (UniqueName: \"kubernetes.io/projected/91d67b1b-578f-46c7-aec8-83785d2fe411-kube-api-access-dkmmj\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.781010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d67b1b-578f-46c7-aec8-83785d2fe411-combined-ca-bundle\") pod \"barbican-api-65c9498c86-5xmqd\" (UID: \"91d67b1b-578f-46c7-aec8-83785d2fe411\") " pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.823096 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 21:55:22 crc kubenswrapper[4962]: I1201 21:55:22.883713 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:23 crc kubenswrapper[4962]: I1201 21:55:23.619886 4962 generic.go:334] "Generic (PLEG): container finished" podID="95331b15-926c-429c-a7e2-41bc904f6629" containerID="453db843f82dd5ea7c684209285d986641744cb608bcf21dbe8e2885d8198ff3" exitCode=0 Dec 01 21:55:23 crc kubenswrapper[4962]: I1201 21:55:23.619916 4962 generic.go:334] "Generic (PLEG): container finished" podID="95331b15-926c-429c-a7e2-41bc904f6629" containerID="93c9f9afcbd718d18e2c431ea6d16d9fe64efb9ed6b30f45ebaafdb9ec6ddbf7" exitCode=143 Dec 01 21:55:23 crc kubenswrapper[4962]: I1201 21:55:23.620851 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95331b15-926c-429c-a7e2-41bc904f6629","Type":"ContainerDied","Data":"453db843f82dd5ea7c684209285d986641744cb608bcf21dbe8e2885d8198ff3"} Dec 01 21:55:23 crc kubenswrapper[4962]: I1201 21:55:23.620877 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95331b15-926c-429c-a7e2-41bc904f6629","Type":"ContainerDied","Data":"93c9f9afcbd718d18e2c431ea6d16d9fe64efb9ed6b30f45ebaafdb9ec6ddbf7"} Dec 01 21:55:24 crc kubenswrapper[4962]: I1201 21:55:24.483645 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:55:27 crc kubenswrapper[4962]: I1201 21:55:27.575391 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:55:27 crc kubenswrapper[4962]: I1201 21:55:27.613629 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b9776df9c-m5wv9" Dec 01 21:55:27 crc kubenswrapper[4962]: I1201 21:55:27.702764 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5579c75d4-fhjn4"] Dec 01 21:55:27 crc kubenswrapper[4962]: I1201 21:55:27.703007 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5579c75d4-fhjn4" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-api" containerID="cri-o://358e4acc7850c64440280b5d1c31ea9120e4d19ba36db602808cd65c946405fb" gracePeriod=30 Dec 01 21:55:27 crc kubenswrapper[4962]: I1201 21:55:27.703402 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5579c75d4-fhjn4" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-httpd" containerID="cri-o://aff509d58e98b5675f1c30d02d6086b8025528aa3dc95c0b52ab03fd5ba9602b" gracePeriod=30 Dec 01 21:55:27 crc kubenswrapper[4962]: I1201 21:55:27.818557 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:27 crc kubenswrapper[4962]: I1201 21:55:27.946126 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.041376 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qgvg7"] Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.057072 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerName="dnsmasq-dns" containerID="cri-o://32c1ca42c8a9652c96705182eb7172aa3626be179a0a65e6d6d1359bc68fcdfa" gracePeriod=10 Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.095620 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.227398 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.200:8776/healthcheck\": dial tcp 10.217.0.200:8776: connect: connection refused" Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.235220 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.295186 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.718606 4962 generic.go:334] "Generic (PLEG): container finished" podID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerID="32c1ca42c8a9652c96705182eb7172aa3626be179a0a65e6d6d1359bc68fcdfa" exitCode=0 Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.718662 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" event={"ID":"2fc05c2b-9041-4361-8635-826c5a64afd2","Type":"ContainerDied","Data":"32c1ca42c8a9652c96705182eb7172aa3626be179a0a65e6d6d1359bc68fcdfa"} Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.720699 4962 generic.go:334] "Generic (PLEG): container finished" podID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerID="aff509d58e98b5675f1c30d02d6086b8025528aa3dc95c0b52ab03fd5ba9602b" exitCode=0 Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.721468 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5579c75d4-fhjn4" event={"ID":"4898fa68-89d9-4aa6-9b60-4503ad99778e","Type":"ContainerDied","Data":"aff509d58e98b5675f1c30d02d6086b8025528aa3dc95c0b52ab03fd5ba9602b"} Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.721582 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="cinder-scheduler" containerID="cri-o://ef2e1e950c9ed0a5e1b348acb20f5897de5d684ef7b59c4d14331740bc98ca0a" gracePeriod=30 Dec 01 21:55:28 crc kubenswrapper[4962]: I1201 21:55:28.721631 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="probe" containerID="cri-o://4fe7f9b202cf3bd255544ba999c4aec029a692aca05094f3299c20d5cdb5170a" gracePeriod=30 Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.376808 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.552652 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95331b15-926c-429c-a7e2-41bc904f6629-etc-machine-id\") pod \"95331b15-926c-429c-a7e2-41bc904f6629\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.552688 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-scripts\") pod \"95331b15-926c-429c-a7e2-41bc904f6629\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.552855 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-combined-ca-bundle\") pod \"95331b15-926c-429c-a7e2-41bc904f6629\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.552993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data-custom\") pod \"95331b15-926c-429c-a7e2-41bc904f6629\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.553020 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95331b15-926c-429c-a7e2-41bc904f6629-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "95331b15-926c-429c-a7e2-41bc904f6629" (UID: "95331b15-926c-429c-a7e2-41bc904f6629"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.553039 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffk2s\" (UniqueName: \"kubernetes.io/projected/95331b15-926c-429c-a7e2-41bc904f6629-kube-api-access-ffk2s\") pod \"95331b15-926c-429c-a7e2-41bc904f6629\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.553108 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data\") pod \"95331b15-926c-429c-a7e2-41bc904f6629\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.553205 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95331b15-926c-429c-a7e2-41bc904f6629-logs\") pod \"95331b15-926c-429c-a7e2-41bc904f6629\" (UID: \"95331b15-926c-429c-a7e2-41bc904f6629\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.554138 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95331b15-926c-429c-a7e2-41bc904f6629-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.554230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95331b15-926c-429c-a7e2-41bc904f6629-logs" (OuterVolumeSpecName: "logs") pod "95331b15-926c-429c-a7e2-41bc904f6629" (UID: "95331b15-926c-429c-a7e2-41bc904f6629"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.560509 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-scripts" (OuterVolumeSpecName: "scripts") pod "95331b15-926c-429c-a7e2-41bc904f6629" (UID: "95331b15-926c-429c-a7e2-41bc904f6629"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.562833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95331b15-926c-429c-a7e2-41bc904f6629-kube-api-access-ffk2s" (OuterVolumeSpecName: "kube-api-access-ffk2s") pod "95331b15-926c-429c-a7e2-41bc904f6629" (UID: "95331b15-926c-429c-a7e2-41bc904f6629"). InnerVolumeSpecName "kube-api-access-ffk2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.569054 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "95331b15-926c-429c-a7e2-41bc904f6629" (UID: "95331b15-926c-429c-a7e2-41bc904f6629"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.576506 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.630029 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data" (OuterVolumeSpecName: "config-data") pod "95331b15-926c-429c-a7e2-41bc904f6629" (UID: "95331b15-926c-429c-a7e2-41bc904f6629"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.634307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95331b15-926c-429c-a7e2-41bc904f6629" (UID: "95331b15-926c-429c-a7e2-41bc904f6629"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.664375 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.664405 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.664414 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffk2s\" (UniqueName: \"kubernetes.io/projected/95331b15-926c-429c-a7e2-41bc904f6629-kube-api-access-ffk2s\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.664427 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.664436 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95331b15-926c-429c-a7e2-41bc904f6629-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.664443 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95331b15-926c-429c-a7e2-41bc904f6629-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.743377 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerStarted","Data":"9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea"} Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.743563 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="ceilometer-central-agent" containerID="cri-o://bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897" gracePeriod=30 Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.743623 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="proxy-httpd" containerID="cri-o://9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea" gracePeriod=30 Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.743639 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="sg-core" containerID="cri-o://3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559" gracePeriod=30 Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.743674 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="ceilometer-notification-agent" containerID="cri-o://e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5" gracePeriod=30 Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.743682 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 
21:55:29.753256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" event={"ID":"2fc05c2b-9041-4361-8635-826c5a64afd2","Type":"ContainerDied","Data":"2c93fcac4a6d1e435abb195fe2941bb633d392dd11030f37db5fec8393d39320"} Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.753311 4962 scope.go:117] "RemoveContainer" containerID="32c1ca42c8a9652c96705182eb7172aa3626be179a0a65e6d6d1359bc68fcdfa" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.753312 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.759810 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"95331b15-926c-429c-a7e2-41bc904f6629","Type":"ContainerDied","Data":"fc9f0f3156bd9b5df0de24db8eaa51cb02c196090b4a7aa45e47b26ecbef71a7"} Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.759887 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.765090 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-svc\") pod \"2fc05c2b-9041-4361-8635-826c5a64afd2\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.765328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-nb\") pod \"2fc05c2b-9041-4361-8635-826c5a64afd2\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.765374 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-swift-storage-0\") pod \"2fc05c2b-9041-4361-8635-826c5a64afd2\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.765438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-sb\") pod \"2fc05c2b-9041-4361-8635-826c5a64afd2\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.765492 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkmnr\" (UniqueName: \"kubernetes.io/projected/2fc05c2b-9041-4361-8635-826c5a64afd2-kube-api-access-kkmnr\") pod \"2fc05c2b-9041-4361-8635-826c5a64afd2\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.765557 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-config\") pod \"2fc05c2b-9041-4361-8635-826c5a64afd2\" (UID: \"2fc05c2b-9041-4361-8635-826c5a64afd2\") " Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.773897 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.979621383 podStartE2EDuration="1m4.77387688s" podCreationTimestamp="2025-12-01 21:54:25 +0000 UTC" firstStartedPulling="2025-12-01 
21:54:27.328610325 +0000 UTC m=+1251.430049520" lastFinishedPulling="2025-12-01 21:55:29.122865822 +0000 UTC m=+1313.224305017" observedRunningTime="2025-12-01 21:55:29.764672278 +0000 UTC m=+1313.866111473" watchObservedRunningTime="2025-12-01 21:55:29.77387688 +0000 UTC m=+1313.875316075" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.775423 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc05c2b-9041-4361-8635-826c5a64afd2-kube-api-access-kkmnr" (OuterVolumeSpecName: "kube-api-access-kkmnr") pod "2fc05c2b-9041-4361-8635-826c5a64afd2" (UID: "2fc05c2b-9041-4361-8635-826c5a64afd2"). InnerVolumeSpecName "kube-api-access-kkmnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.824491 4962 scope.go:117] "RemoveContainer" containerID="584b68309f93cef9db13325a12b8d5d6137af068bfa65cd1ae925e264ba7e8ef" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.843212 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.850245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-config" (OuterVolumeSpecName: "config") pod "2fc05c2b-9041-4361-8635-826c5a64afd2" (UID: "2fc05c2b-9041-4361-8635-826c5a64afd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.862858 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fc05c2b-9041-4361-8635-826c5a64afd2" (UID: "2fc05c2b-9041-4361-8635-826c5a64afd2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.867501 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2fc05c2b-9041-4361-8635-826c5a64afd2" (UID: "2fc05c2b-9041-4361-8635-826c5a64afd2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.877744 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.877775 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.877785 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkmnr\" (UniqueName: \"kubernetes.io/projected/2fc05c2b-9041-4361-8635-826c5a64afd2-kube-api-access-kkmnr\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.877796 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.881340 4962 scope.go:117] "RemoveContainer" containerID="453db843f82dd5ea7c684209285d986641744cb608bcf21dbe8e2885d8198ff3" Dec 01 21:55:29 crc kubenswrapper[4962]: W1201 21:55:29.896050 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91d67b1b_578f_46c7_aec8_83785d2fe411.slice/crio-013229aff7e1cfc2d5cd733c97334974a7987f67ec35d4c9e9ab12e3bbf67d3f WatchSource:0}: Error finding container 013229aff7e1cfc2d5cd733c97334974a7987f67ec35d4c9e9ab12e3bbf67d3f: Status 404 returned error can't find the container with id 013229aff7e1cfc2d5cd733c97334974a7987f67ec35d4c9e9ab12e3bbf67d3f Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.901248 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.904614 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fc05c2b-9041-4361-8635-826c5a64afd2" (UID: "2fc05c2b-9041-4361-8635-826c5a64afd2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.911314 4962 scope.go:117] "RemoveContainer" containerID="93c9f9afcbd718d18e2c431ea6d16d9fe64efb9ed6b30f45ebaafdb9ec6ddbf7" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.922748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2fc05c2b-9041-4361-8635-826c5a64afd2" (UID: "2fc05c2b-9041-4361-8635-826c5a64afd2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.925370 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:29 crc kubenswrapper[4962]: E1201 21:55:29.925817 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerName="init" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.925833 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerName="init" Dec 01 21:55:29 crc kubenswrapper[4962]: E1201 21:55:29.925843 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.925849 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api" Dec 01 21:55:29 crc kubenswrapper[4962]: E1201 21:55:29.925859 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerName="dnsmasq-dns" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.925867 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerName="dnsmasq-dns" Dec 01 21:55:29 crc kubenswrapper[4962]: E1201 21:55:29.925878 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api-log" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.925885 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api-log" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.930446 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.930514 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerName="dnsmasq-dns" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.930537 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="95331b15-926c-429c-a7e2-41bc904f6629" containerName="cinder-api-log" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.943606 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.944750 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.946263 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.946513 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 21:55:29 crc kubenswrapper[4962]: I1201 21:55:29.980919 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.002839 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65c9498c86-5xmqd"] Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031308 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-config-data\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031482 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-config-data-custom\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-logs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031650 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7mfw\" (UniqueName: \"kubernetes.io/projected/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-kube-api-access-f7mfw\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031718 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031740 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 
21:55:30.031775 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-scripts\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031883 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.031900 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fc05c2b-9041-4361-8635-826c5a64afd2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.133966 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-config-data-custom\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-logs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7mfw\" (UniqueName: \"kubernetes.io/projected/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-kube-api-access-f7mfw\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134129 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134168 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-scripts\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134216 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-config-data\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134230 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.134689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-logs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.139543 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.140213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-scripts\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.140512 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-config-data-custom\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.142862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.144888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-config-data\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.151570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7mfw\" (UniqueName: 
\"kubernetes.io/projected/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-kube-api-access-f7mfw\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.159712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356fbcf6-1bde-4b7b-bf5f-7be551d1a03c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c\") " pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.236828 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95331b15-926c-429c-a7e2-41bc904f6629" path="/var/lib/kubelet/pods/95331b15-926c-429c-a7e2-41bc904f6629/volumes" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.292748 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qgvg7"] Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.300903 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qgvg7"] Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.301802 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.766994 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 21:55:30 crc kubenswrapper[4962]: W1201 21:55:30.790829 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod356fbcf6_1bde_4b7b_bf5f_7be551d1a03c.slice/crio-58ac74d646273b00ac594a7e0122ffb118865d7dd06e05f83e5c731f020ddb36 WatchSource:0}: Error finding container 58ac74d646273b00ac594a7e0122ffb118865d7dd06e05f83e5c731f020ddb36: Status 404 returned error can't find the container with id 58ac74d646273b00ac594a7e0122ffb118865d7dd06e05f83e5c731f020ddb36 Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.794984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c9498c86-5xmqd" event={"ID":"91d67b1b-578f-46c7-aec8-83785d2fe411","Type":"ContainerStarted","Data":"a388c59a7d79bfb6da529463b8184ee77beef2a89d0752802798f8c4a3ef851c"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.795009 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c9498c86-5xmqd" event={"ID":"91d67b1b-578f-46c7-aec8-83785d2fe411","Type":"ContainerStarted","Data":"ebba819067ba3c557347e037e494c680912b937cc1628eb60a3ecce0861e8364"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.795022 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.795032 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c9498c86-5xmqd" event={"ID":"91d67b1b-578f-46c7-aec8-83785d2fe411","Type":"ContainerStarted","Data":"013229aff7e1cfc2d5cd733c97334974a7987f67ec35d4c9e9ab12e3bbf67d3f"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.795066 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.804570 4962 generic.go:334] "Generic (PLEG): container finished" podID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerID="9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea" exitCode=0 Dec 01 21:55:30 
crc kubenswrapper[4962]: I1201 21:55:30.804600 4962 generic.go:334] "Generic (PLEG): container finished" podID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerID="3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559" exitCode=2 Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.804608 4962 generic.go:334] "Generic (PLEG): container finished" podID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerID="bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897" exitCode=0 Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.804650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerDied","Data":"9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.804703 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerDied","Data":"3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.804715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerDied","Data":"bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.806682 4962 generic.go:334] "Generic (PLEG): container finished" podID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerID="4fe7f9b202cf3bd255544ba999c4aec029a692aca05094f3299c20d5cdb5170a" exitCode=0 Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.806698 4962 generic.go:334] "Generic (PLEG): container finished" podID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerID="ef2e1e950c9ed0a5e1b348acb20f5897de5d684ef7b59c4d14331740bc98ca0a" exitCode=0 Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.806715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cf6cbec-dfb2-4afc-a180-e38559810d02","Type":"ContainerDied","Data":"4fe7f9b202cf3bd255544ba999c4aec029a692aca05094f3299c20d5cdb5170a"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.806741 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cf6cbec-dfb2-4afc-a180-e38559810d02","Type":"ContainerDied","Data":"ef2e1e950c9ed0a5e1b348acb20f5897de5d684ef7b59c4d14331740bc98ca0a"} Dec 01 21:55:30 crc kubenswrapper[4962]: I1201 21:55:30.817022 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-65c9498c86-5xmqd" podStartSLOduration=8.81700537 podStartE2EDuration="8.81700537s" podCreationTimestamp="2025-12-01 21:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:30.816812325 +0000 UTC m=+1314.918251520" watchObservedRunningTime="2025-12-01 21:55:30.81700537 +0000 UTC m=+1314.918444565" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.372077 4962 util.go:48] "No ready sandbox for pod can be found. 
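
The pod_startup_latency_tracker record that closes this block is worth decoding: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window. The zero-valued pull timestamps ("0001-01-01 00:00:00") mean no pull was recorded for barbican-api (the images were already present), which is why the two durations coincide at 8.81700537s. A quick check of that arithmetic; the field semantics here are inferred from kubelet's pod_startup_latency_tracker output, so treat this as a reading aid rather than a spec:

    // sloduration.go - decode the barbican-api record above:
    // E2E = watchObservedRunningTime - podCreationTimestamp, and with
    // no image pull recorded the SLO duration equals the E2E duration.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Go's default Time.String() layout, which is what the record prints.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2025-12-01 21:55:22 +0000 UTC")
        running, _ := time.Parse(layout, "2025-12-01 21:55:30.81700537 +0000 UTC")
        fmt.Println(running.Sub(created)) // 8.81700537s == podStartSLOduration
    }
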
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.372077 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.474877 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data-custom\") pod \"6cf6cbec-dfb2-4afc-a180-e38559810d02\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.474952 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data\") pod \"6cf6cbec-dfb2-4afc-a180-e38559810d02\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.474989 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-scripts\") pod \"6cf6cbec-dfb2-4afc-a180-e38559810d02\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.475005 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptcv4\" (UniqueName: \"kubernetes.io/projected/6cf6cbec-dfb2-4afc-a180-e38559810d02-kube-api-access-ptcv4\") pod \"6cf6cbec-dfb2-4afc-a180-e38559810d02\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.475055 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cf6cbec-dfb2-4afc-a180-e38559810d02-etc-machine-id\") pod \"6cf6cbec-dfb2-4afc-a180-e38559810d02\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.475126 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-combined-ca-bundle\") pod \"6cf6cbec-dfb2-4afc-a180-e38559810d02\" (UID: \"6cf6cbec-dfb2-4afc-a180-e38559810d02\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.475627 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cf6cbec-dfb2-4afc-a180-e38559810d02-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6cf6cbec-dfb2-4afc-a180-e38559810d02" (UID: "6cf6cbec-dfb2-4afc-a180-e38559810d02"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.496241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6cf6cbec-dfb2-4afc-a180-e38559810d02" (UID: "6cf6cbec-dfb2-4afc-a180-e38559810d02"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.499232 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf6cbec-dfb2-4afc-a180-e38559810d02-kube-api-access-ptcv4" (OuterVolumeSpecName: "kube-api-access-ptcv4") pod "6cf6cbec-dfb2-4afc-a180-e38559810d02" (UID: "6cf6cbec-dfb2-4afc-a180-e38559810d02"). InnerVolumeSpecName "kube-api-access-ptcv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.518165 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-scripts" (OuterVolumeSpecName: "scripts") pod "6cf6cbec-dfb2-4afc-a180-e38559810d02" (UID: "6cf6cbec-dfb2-4afc-a180-e38559810d02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.577040 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.577097 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.577130 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptcv4\" (UniqueName: \"kubernetes.io/projected/6cf6cbec-dfb2-4afc-a180-e38559810d02-kube-api-access-ptcv4\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.577141 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cf6cbec-dfb2-4afc-a180-e38559810d02-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.603837 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.605021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data" (OuterVolumeSpecName: "config-data") pod "6cf6cbec-dfb2-4afc-a180-e38559810d02" (UID: "6cf6cbec-dfb2-4afc-a180-e38559810d02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.643142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cf6cbec-dfb2-4afc-a180-e38559810d02" (UID: "6cf6cbec-dfb2-4afc-a180-e38559810d02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.679028 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-log-httpd\") pod \"624dc66d-d4f2-447b-861b-23987a75a3d6\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.681837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "624dc66d-d4f2-447b-861b-23987a75a3d6" (UID: "624dc66d-d4f2-447b-861b-23987a75a3d6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.683203 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-run-httpd\") pod \"624dc66d-d4f2-447b-861b-23987a75a3d6\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.684007 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-sg-core-conf-yaml\") pod \"624dc66d-d4f2-447b-861b-23987a75a3d6\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.683412 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "624dc66d-d4f2-447b-861b-23987a75a3d6" (UID: "624dc66d-d4f2-447b-861b-23987a75a3d6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.684837 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-combined-ca-bundle\") pod \"624dc66d-d4f2-447b-861b-23987a75a3d6\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.684927 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-scripts\") pod \"624dc66d-d4f2-447b-861b-23987a75a3d6\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.684996 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxvtk\" (UniqueName: \"kubernetes.io/projected/624dc66d-d4f2-447b-861b-23987a75a3d6-kube-api-access-jxvtk\") pod \"624dc66d-d4f2-447b-861b-23987a75a3d6\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.685071 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-config-data\") pod \"624dc66d-d4f2-447b-861b-23987a75a3d6\" (UID: \"624dc66d-d4f2-447b-861b-23987a75a3d6\") "
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.687187 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.687205 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624dc66d-d4f2-447b-861b-23987a75a3d6-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.687215 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.687223 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/secret/6cf6cbec-dfb2-4afc-a180-e38559810d02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.695207 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-scripts" (OuterVolumeSpecName: "scripts") pod "624dc66d-d4f2-447b-861b-23987a75a3d6" (UID: "624dc66d-d4f2-447b-861b-23987a75a3d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.709260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624dc66d-d4f2-447b-861b-23987a75a3d6-kube-api-access-jxvtk" (OuterVolumeSpecName: "kube-api-access-jxvtk") pod "624dc66d-d4f2-447b-861b-23987a75a3d6" (UID: "624dc66d-d4f2-447b-861b-23987a75a3d6"). InnerVolumeSpecName "kube-api-access-jxvtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.776103 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "624dc66d-d4f2-447b-861b-23987a75a3d6" (UID: "624dc66d-d4f2-447b-861b-23987a75a3d6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.789620 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.789662 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.789672 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxvtk\" (UniqueName: \"kubernetes.io/projected/624dc66d-d4f2-447b-861b-23987a75a3d6-kube-api-access-jxvtk\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.838025 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "624dc66d-d4f2-447b-861b-23987a75a3d6" (UID: "624dc66d-d4f2-447b-861b-23987a75a3d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.848240 4962 generic.go:334] "Generic (PLEG): container finished" podID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerID="e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5" exitCode=0 Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.848319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerDied","Data":"e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5"} Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.848368 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624dc66d-d4f2-447b-861b-23987a75a3d6","Type":"ContainerDied","Data":"3ea58faafc6a483aea4ddd4602ca9cc031011293c854d83a3f507657532c683e"} Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.848384 4962 scope.go:117] "RemoveContainer" containerID="9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.848486 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.877415 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6cf6cbec-dfb2-4afc-a180-e38559810d02","Type":"ContainerDied","Data":"6e0f0f845d71a8e5bad98dff9a711f23aaad820e9ed661556b4224357fdd5f5b"} Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.877521 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.889291 4962 scope.go:117] "RemoveContainer" containerID="3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.891528 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.898894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c","Type":"ContainerStarted","Data":"82435919e05244ff435e3560231bdec51b267bee80cc3239a03abd2005742d64"} Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.898965 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c","Type":"ContainerStarted","Data":"58ac74d646273b00ac594a7e0122ffb118865d7dd06e05f83e5c731f020ddb36"} Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.934304 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.952886 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-config-data" (OuterVolumeSpecName: "config-data") pod "624dc66d-d4f2-447b-861b-23987a75a3d6" (UID: "624dc66d-d4f2-447b-861b-23987a75a3d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.964228 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.966627 4962 scope.go:117] "RemoveContainer" containerID="e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984026 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:31 crc kubenswrapper[4962]: E1201 21:55:31.984464 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="probe" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984481 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="probe" Dec 01 21:55:31 crc kubenswrapper[4962]: E1201 21:55:31.984496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="ceilometer-central-agent" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984503 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="ceilometer-central-agent" Dec 01 21:55:31 crc kubenswrapper[4962]: E1201 21:55:31.984520 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="sg-core" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984526 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="sg-core" Dec 01 21:55:31 crc kubenswrapper[4962]: E1201 21:55:31.984538 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="proxy-httpd" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984543 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="proxy-httpd" Dec 01 21:55:31 crc kubenswrapper[4962]: E1201 21:55:31.984555 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="ceilometer-notification-agent" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984561 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="ceilometer-notification-agent" Dec 01 21:55:31 crc kubenswrapper[4962]: E1201 21:55:31.984584 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="cinder-scheduler" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984590 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="cinder-scheduler" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984784 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="cinder-scheduler" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984808 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="ceilometer-central-agent" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984815 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" 
containerName="ceilometer-notification-agent" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984831 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="proxy-httpd" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984838 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" containerName="sg-core" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.984855 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" containerName="probe" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.985983 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.994789 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624dc66d-d4f2-447b-861b-23987a75a3d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.995056 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:31 crc kubenswrapper[4962]: I1201 21:55:31.995337 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.096415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.096471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhq65\" (UniqueName: \"kubernetes.io/projected/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-kube-api-access-zhq65\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.096501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.096548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.096575 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.096599 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.099485 4962 scope.go:117] "RemoveContainer" containerID="bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.129199 4962 scope.go:117] "RemoveContainer" containerID="9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea" Dec 01 21:55:32 crc kubenswrapper[4962]: E1201 21:55:32.129903 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea\": container with ID starting with 9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea not found: ID does not exist" containerID="9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.129989 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea"} err="failed to get container status \"9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea\": rpc error: code = NotFound desc = could not find container \"9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea\": container with ID starting with 9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea not found: ID does not exist" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.130020 4962 scope.go:117] "RemoveContainer" containerID="3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559" Dec 01 21:55:32 crc kubenswrapper[4962]: E1201 21:55:32.130485 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559\": container with ID starting with 3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559 not found: ID does not exist" containerID="3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.130524 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559"} err="failed to get container status \"3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559\": rpc error: code = NotFound desc = could not find container \"3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559\": container with ID starting with 3b700d2def88961fd57d19c4e4cc0de95c1da92a0cb130fc9a051f5f95bc4559 not found: ID does not exist" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.130547 4962 scope.go:117] "RemoveContainer" containerID="e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5" Dec 01 21:55:32 crc kubenswrapper[4962]: E1201 21:55:32.130807 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5\": container with ID starting with e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5 not found: ID does not exist" containerID="e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.130832 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5"} err="failed to get container status \"e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5\": rpc error: code = NotFound desc = could not find container \"e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5\": container with ID starting with e957539bd231854f0073f95ebd36d89ec2ea62e54fe60af4a852a9def4749bd5 not found: ID does not exist" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.130847 4962 scope.go:117] "RemoveContainer" containerID="bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897" Dec 01 21:55:32 crc kubenswrapper[4962]: E1201 21:55:32.133205 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897\": container with ID starting with bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897 not found: ID does not exist" containerID="bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.133231 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897"} err="failed to get container status \"bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897\": rpc error: code = NotFound desc = could not find container \"bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897\": container with ID starting with bfb880d1e65ab732bffe713cc7b20a7199ecc71eb02fc8813b5e5e816272e897 not found: ID does not exist" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.133247 4962 scope.go:117] "RemoveContainer" containerID="4fe7f9b202cf3bd255544ba999c4aec029a692aca05094f3299c20d5cdb5170a" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.167064 4962 scope.go:117] "RemoveContainer" containerID="ef2e1e950c9ed0a5e1b348acb20f5897de5d684ef7b59c4d14331740bc98ca0a" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.183355 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.198378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.198445 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhq65\" (UniqueName: \"kubernetes.io/projected/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-kube-api-access-zhq65\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.198479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.198527 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
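
The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triples above are a benign race, not data loss: by the time the kubelet's container deletor asks CRI-O for status, the exited containers have already been removed along with their pod sandbox, so the lookup returns gRPC NotFound and cleanup simply moves on. The usual client-side pattern treats NotFound as the goal already being met; a sketch in which removeContainer is a stand-in, not a real CRI call:

    // removecontainer.go - idempotent delete: a missing container means
    // there is nothing left to remove, so NotFound is swallowed.
    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    func removeContainer(id string) error {
        // Stand-in: pretend the runtime already garbage-collected it.
        return status.Error(codes.NotFound, fmt.Sprintf("could not find container %q", id))
    }

    func removeIfPresent(id string) error {
        if err := removeContainer(id); err != nil {
            if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
                return nil // already gone: deletion is idempotent
            }
            return err
        }
        return nil
    }

    func main() {
        err := removeIfPresent("9c22fd0df8b5c2c87c6bc63c654a1544e746d51671d99f82a396e56826cd19ea")
        fmt.Println("cleanup error:", err) // <nil>
    }
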
(UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.198556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.198604 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.199248 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.204257 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.205266 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.205799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.215649 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.217306 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.220168 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhq65\" (UniqueName: \"kubernetes.io/projected/3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d-kube-api-access-zhq65\") pod \"cinder-scheduler-0\" (UID: \"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d\") " pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.238877 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" path="/var/lib/kubelet/pods/2fc05c2b-9041-4361-8635-826c5a64afd2/volumes" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.239761 
4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624dc66d-d4f2-447b-861b-23987a75a3d6" path="/var/lib/kubelet/pods/624dc66d-d4f2-447b-861b-23987a75a3d6/volumes" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.240833 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf6cbec-dfb2-4afc-a180-e38559810d02" path="/var/lib/kubelet/pods/6cf6cbec-dfb2-4afc-a180-e38559810d02/volumes" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.247692 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.255127 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.258221 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.258314 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.295667 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.301252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-run-httpd\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.301317 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-log-httpd\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.301343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.301390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.301414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4skn\" (UniqueName: \"kubernetes.io/projected/2d6db005-fb47-40ad-979c-04ccd8146c41-kube-api-access-r4skn\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.301488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-config-data\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.301520 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-scripts\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.397021 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.403250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-config-data\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.403311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-scripts\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.403370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-run-httpd\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.403409 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-log-httpd\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.403433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.403477 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.403501 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4skn\" (UniqueName: \"kubernetes.io/projected/2d6db005-fb47-40ad-979c-04ccd8146c41-kube-api-access-r4skn\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.404037 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-run-httpd\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.404179 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-log-httpd\") pod \"ceilometer-0\" (UID: 
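
Note what changed across the delete/re-add cycle in this block: the pod names stay fixed while the UIDs roll over (ceilometer-0: 624dc66d... to 2d6db005...; cinder-scheduler-0: 6cf6cbec... to 3c2a5fd2...). That is a controller deleting and recreating pods under the same name, which is also why kubelet reports "No sandbox for pod can be found. Need to start a new one": a fresh UID has never had a sandbox. A small scanner that surfaces such same-name/new-UID churn from a capture like this one; the regex is fitted to the volume records above and is an assumption, not a stable format:

    // uidchurn.go - group kubelet volume records by pod name and count
    // distinct UIDs, to spot delete-and-recreate churn.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var rec = regexp.MustCompile(`pod \\"([a-z0-9.-]+)\\" \(UID: \\"([0-9a-f-]{36})\\"\)`)

    func main() {
        uids := map[string]map[string]bool{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1<<20), 1<<20)
        for sc.Scan() {
            for _, m := range rec.FindAllStringSubmatch(sc.Text(), -1) {
                if uids[m[1]] == nil {
                    uids[m[1]] = map[string]bool{}
                }
                uids[m[1]][m[2]] = true
            }
        }
        for pod, set := range uids {
            if len(set) > 1 {
                fmt.Printf("%s was recreated: %d distinct UIDs\n", pod, len(set))
            }
        }
    }
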
\"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.408084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.408293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-config-data\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.408834 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-scripts\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.411135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.417995 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4skn\" (UniqueName: \"kubernetes.io/projected/2d6db005-fb47-40ad-979c-04ccd8146c41-kube-api-access-r4skn\") pod \"ceilometer-0\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.616579 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.901753 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.913020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d","Type":"ContainerStarted","Data":"690aa5ef6f4f508b9c7248c5c44ff7893d64e630142f4161d0435c9c68948126"} Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.917700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"356fbcf6-1bde-4b7b-bf5f-7be551d1a03c","Type":"ContainerStarted","Data":"4fbd0733b4f6762c59923b05cc62f85f4baebf6134879630bc406d53913f1d7f"} Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.917825 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 21:55:32 crc kubenswrapper[4962]: I1201 21:55:32.971805 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.971788645 podStartE2EDuration="3.971788645s" podCreationTimestamp="2025-12-01 21:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:32.966435422 +0000 UTC m=+1317.067874647" watchObservedRunningTime="2025-12-01 21:55:32.971788645 +0000 UTC m=+1317.073227840" Dec 01 21:55:33 crc kubenswrapper[4962]: I1201 21:55:33.142377 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:33 crc kubenswrapper[4962]: I1201 21:55:33.932359 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerStarted","Data":"88820d2455788335cb70872e685f36cce70095113a6ab68695d19976c7052d48"} Dec 01 21:55:33 crc kubenswrapper[4962]: I1201 21:55:33.932602 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerStarted","Data":"3978d68b47d582dd594ff3c60e1c8e5c491e7d7e7bf9857d41542924c852d3bc"} Dec 01 21:55:33 crc kubenswrapper[4962]: I1201 21:55:33.934999 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d","Type":"ContainerStarted","Data":"60302190e9b6a41879d317c160033eaa7892c9b1c6c894f5b7b896735499bb7b"} Dec 01 21:55:34 crc kubenswrapper[4962]: I1201 21:55:34.401038 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-qgvg7" podUID="2fc05c2b-9041-4361-8635-826c5a64afd2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.189:5353: i/o timeout" Dec 01 21:55:34 crc kubenswrapper[4962]: I1201 21:55:34.951166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerStarted","Data":"5aa07407749c8b852d88b0d747067ab051c1258e0cff82c7cb0c70d57542b327"} Dec 01 21:55:34 crc kubenswrapper[4962]: I1201 21:55:34.953955 4962 generic.go:334] "Generic (PLEG): container finished" podID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerID="358e4acc7850c64440280b5d1c31ea9120e4d19ba36db602808cd65c946405fb" exitCode=0 Dec 01 21:55:34 crc kubenswrapper[4962]: I1201 21:55:34.954017 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-5579c75d4-fhjn4" event={"ID":"4898fa68-89d9-4aa6-9b60-4503ad99778e","Type":"ContainerDied","Data":"358e4acc7850c64440280b5d1c31ea9120e4d19ba36db602808cd65c946405fb"} Dec 01 21:55:34 crc kubenswrapper[4962]: I1201 21:55:34.956176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d","Type":"ContainerStarted","Data":"5adcbd44549600df0b32361e3e82ecbabf4f4a1e92a261744f05b7469f54d485"} Dec 01 21:55:34 crc kubenswrapper[4962]: I1201 21:55:34.993525 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.993507136 podStartE2EDuration="3.993507136s" podCreationTimestamp="2025-12-01 21:55:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:34.984489289 +0000 UTC m=+1319.085928484" watchObservedRunningTime="2025-12-01 21:55:34.993507136 +0000 UTC m=+1319.094946331" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.293919 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.407002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-ovndb-tls-certs\") pod \"4898fa68-89d9-4aa6-9b60-4503ad99778e\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.407053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-config\") pod \"4898fa68-89d9-4aa6-9b60-4503ad99778e\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.407127 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-httpd-config\") pod \"4898fa68-89d9-4aa6-9b60-4503ad99778e\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.407174 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-combined-ca-bundle\") pod \"4898fa68-89d9-4aa6-9b60-4503ad99778e\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.407269 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdpt9\" (UniqueName: \"kubernetes.io/projected/4898fa68-89d9-4aa6-9b60-4503ad99778e-kube-api-access-wdpt9\") pod \"4898fa68-89d9-4aa6-9b60-4503ad99778e\" (UID: \"4898fa68-89d9-4aa6-9b60-4503ad99778e\") " Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.412344 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4898fa68-89d9-4aa6-9b60-4503ad99778e-kube-api-access-wdpt9" (OuterVolumeSpecName: "kube-api-access-wdpt9") pod "4898fa68-89d9-4aa6-9b60-4503ad99778e" (UID: "4898fa68-89d9-4aa6-9b60-4503ad99778e"). InnerVolumeSpecName "kube-api-access-wdpt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.416308 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4898fa68-89d9-4aa6-9b60-4503ad99778e" (UID: "4898fa68-89d9-4aa6-9b60-4503ad99778e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.478294 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-config" (OuterVolumeSpecName: "config") pod "4898fa68-89d9-4aa6-9b60-4503ad99778e" (UID: "4898fa68-89d9-4aa6-9b60-4503ad99778e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.493509 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4898fa68-89d9-4aa6-9b60-4503ad99778e" (UID: "4898fa68-89d9-4aa6-9b60-4503ad99778e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.509026 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4898fa68-89d9-4aa6-9b60-4503ad99778e" (UID: "4898fa68-89d9-4aa6-9b60-4503ad99778e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.510121 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.510160 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.510171 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.510180 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4898fa68-89d9-4aa6-9b60-4503ad99778e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.510189 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdpt9\" (UniqueName: \"kubernetes.io/projected/4898fa68-89d9-4aa6-9b60-4503ad99778e-kube-api-access-wdpt9\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.972635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerStarted","Data":"dbdae4b355d4072d74e9f0cf004663def7872c74591d4a907f201b028ae0c649"} Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.975400 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5579c75d4-fhjn4" Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.975399 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5579c75d4-fhjn4" event={"ID":"4898fa68-89d9-4aa6-9b60-4503ad99778e","Type":"ContainerDied","Data":"a7b5cb61d6d8aa54419ad0dfa828d9892a01bab53fb748a9e2d78225751479c6"} Dec 01 21:55:35 crc kubenswrapper[4962]: I1201 21:55:35.975673 4962 scope.go:117] "RemoveContainer" containerID="aff509d58e98b5675f1c30d02d6086b8025528aa3dc95c0b52ab03fd5ba9602b" Dec 01 21:55:36 crc kubenswrapper[4962]: I1201 21:55:36.008042 4962 scope.go:117] "RemoveContainer" containerID="358e4acc7850c64440280b5d1c31ea9120e4d19ba36db602808cd65c946405fb" Dec 01 21:55:36 crc kubenswrapper[4962]: I1201 21:55:36.035907 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5579c75d4-fhjn4"] Dec 01 21:55:36 crc kubenswrapper[4962]: I1201 21:55:36.052507 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5579c75d4-fhjn4"] Dec 01 21:55:36 crc kubenswrapper[4962]: I1201 21:55:36.240244 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" path="/var/lib/kubelet/pods/4898fa68-89d9-4aa6-9b60-4503ad99778e/volumes" Dec 01 21:55:37 crc kubenswrapper[4962]: I1201 21:55:37.397192 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 21:55:38 crc kubenswrapper[4962]: I1201 21:55:38.004420 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerStarted","Data":"83854fb61d90bc731ffa1b9c20134fab91b244e3e3f98020bc76170b355d69b1"} Dec 01 21:55:38 crc kubenswrapper[4962]: I1201 21:55:38.006346 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:55:38 crc kubenswrapper[4962]: I1201 21:55:38.040848 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.007119473 podStartE2EDuration="6.040826289s" podCreationTimestamp="2025-12-01 21:55:32 +0000 UTC" firstStartedPulling="2025-12-01 21:55:33.149980111 +0000 UTC m=+1317.251419306" lastFinishedPulling="2025-12-01 21:55:37.183686927 +0000 UTC m=+1321.285126122" observedRunningTime="2025-12-01 21:55:38.035352213 +0000 UTC m=+1322.136791418" watchObservedRunningTime="2025-12-01 21:55:38.040826289 +0000 UTC m=+1322.142265484" Dec 01 21:55:38 crc kubenswrapper[4962]: I1201 21:55:38.321032 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-bfc984cd5-wc42c" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.034868 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 21:55:39 crc kubenswrapper[4962]: E1201 21:55:39.036354 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-httpd" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.036442 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-httpd" Dec 01 21:55:39 crc kubenswrapper[4962]: E1201 21:55:39.036524 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-api" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.036582 4962 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-api" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.036858 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-httpd" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.036927 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4898fa68-89d9-4aa6-9b60-4503ad99778e" containerName="neutron-api" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.037729 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.040411 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pd8ng" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.040554 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.040859 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.057690 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.171327 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpz8h\" (UniqueName: \"kubernetes.io/projected/9fd1f254-7f23-46a6-b2fd-986de362e028-kube-api-access-kpz8h\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.171420 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1f254-7f23-46a6-b2fd-986de362e028-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.171549 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9fd1f254-7f23-46a6-b2fd-986de362e028-openstack-config-secret\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.171581 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9fd1f254-7f23-46a6-b2fd-986de362e028-openstack-config\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.273541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1f254-7f23-46a6-b2fd-986de362e028-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.273673 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9fd1f254-7f23-46a6-b2fd-986de362e028-openstack-config-secret\") 
pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.273699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9fd1f254-7f23-46a6-b2fd-986de362e028-openstack-config\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.273788 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpz8h\" (UniqueName: \"kubernetes.io/projected/9fd1f254-7f23-46a6-b2fd-986de362e028-kube-api-access-kpz8h\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.275812 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9fd1f254-7f23-46a6-b2fd-986de362e028-openstack-config\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.283531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1f254-7f23-46a6-b2fd-986de362e028-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.289860 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9fd1f254-7f23-46a6-b2fd-986de362e028-openstack-config-secret\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.310733 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpz8h\" (UniqueName: \"kubernetes.io/projected/9fd1f254-7f23-46a6-b2fd-986de362e028-kube-api-access-kpz8h\") pod \"openstackclient\" (UID: \"9fd1f254-7f23-46a6-b2fd-986de362e028\") " pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.396828 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 21:55:39 crc kubenswrapper[4962]: I1201 21:55:39.952272 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 21:55:40 crc kubenswrapper[4962]: I1201 21:55:40.024715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9fd1f254-7f23-46a6-b2fd-986de362e028","Type":"ContainerStarted","Data":"0281f3f493c69a3f7bdabcee41c867a4e1e28f92d4f9df0b7adaf3a21d9ae528"} Dec 01 21:55:40 crc kubenswrapper[4962]: I1201 21:55:40.839681 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:41 crc kubenswrapper[4962]: I1201 21:55:41.254620 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65c9498c86-5xmqd" Dec 01 21:55:41 crc kubenswrapper[4962]: I1201 21:55:41.419975 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55db4bf6b-mhhwj"] Dec 01 21:55:41 crc kubenswrapper[4962]: I1201 21:55:41.420252 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55db4bf6b-mhhwj" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api-log" containerID="cri-o://56188a8c3bc08d44dc9fad529e37f59ccaac6cc4e0beafc1c2ef0dfe6979ae6a" gracePeriod=30 Dec 01 21:55:41 crc kubenswrapper[4962]: I1201 21:55:41.420732 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55db4bf6b-mhhwj" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api" containerID="cri-o://02bbd4368f9bd123ad569cc0d9dc963126eb75ec8073c5bd5ce7b17f8dcca4eb" gracePeriod=30 Dec 01 21:55:41 crc kubenswrapper[4962]: I1201 21:55:41.441148 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55db4bf6b-mhhwj" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.197:9311/healthcheck\": EOF" Dec 01 21:55:42 crc kubenswrapper[4962]: I1201 21:55:42.056538 4962 generic.go:334] "Generic (PLEG): container finished" podID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerID="56188a8c3bc08d44dc9fad529e37f59ccaac6cc4e0beafc1c2ef0dfe6979ae6a" exitCode=143 Dec 01 21:55:42 crc kubenswrapper[4962]: I1201 21:55:42.056827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55db4bf6b-mhhwj" event={"ID":"79441f7a-11fa-4ace-8deb-29d7db95e67c","Type":"ContainerDied","Data":"56188a8c3bc08d44dc9fad529e37f59ccaac6cc4e0beafc1c2ef0dfe6979ae6a"} Dec 01 21:55:42 crc kubenswrapper[4962]: I1201 21:55:42.774798 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 21:55:44 crc kubenswrapper[4962]: I1201 21:55:44.777285 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:44 crc kubenswrapper[4962]: I1201 21:55:44.812672 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54647f544-d6jzt" Dec 01 21:55:44 crc kubenswrapper[4962]: I1201 21:55:44.812783 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.140085 4962 generic.go:334] "Generic (PLEG): container finished" podID="79441f7a-11fa-4ace-8deb-29d7db95e67c" 
containerID="02bbd4368f9bd123ad569cc0d9dc963126eb75ec8073c5bd5ce7b17f8dcca4eb" exitCode=0 Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.141618 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55db4bf6b-mhhwj" event={"ID":"79441f7a-11fa-4ace-8deb-29d7db95e67c","Type":"ContainerDied","Data":"02bbd4368f9bd123ad569cc0d9dc963126eb75ec8073c5bd5ce7b17f8dcca4eb"} Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.627494 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.754770 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg5ht\" (UniqueName: \"kubernetes.io/projected/79441f7a-11fa-4ace-8deb-29d7db95e67c-kube-api-access-pg5ht\") pod \"79441f7a-11fa-4ace-8deb-29d7db95e67c\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.754901 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79441f7a-11fa-4ace-8deb-29d7db95e67c-logs\") pod \"79441f7a-11fa-4ace-8deb-29d7db95e67c\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.754961 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data\") pod \"79441f7a-11fa-4ace-8deb-29d7db95e67c\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.754991 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data-custom\") pod \"79441f7a-11fa-4ace-8deb-29d7db95e67c\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.755239 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-combined-ca-bundle\") pod \"79441f7a-11fa-4ace-8deb-29d7db95e67c\" (UID: \"79441f7a-11fa-4ace-8deb-29d7db95e67c\") " Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.756983 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79441f7a-11fa-4ace-8deb-29d7db95e67c-logs" (OuterVolumeSpecName: "logs") pod "79441f7a-11fa-4ace-8deb-29d7db95e67c" (UID: "79441f7a-11fa-4ace-8deb-29d7db95e67c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.783157 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79441f7a-11fa-4ace-8deb-29d7db95e67c" (UID: "79441f7a-11fa-4ace-8deb-29d7db95e67c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.783439 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79441f7a-11fa-4ace-8deb-29d7db95e67c-kube-api-access-pg5ht" (OuterVolumeSpecName: "kube-api-access-pg5ht") pod "79441f7a-11fa-4ace-8deb-29d7db95e67c" (UID: "79441f7a-11fa-4ace-8deb-29d7db95e67c"). InnerVolumeSpecName "kube-api-access-pg5ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.858320 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.858358 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg5ht\" (UniqueName: \"kubernetes.io/projected/79441f7a-11fa-4ace-8deb-29d7db95e67c-kube-api-access-pg5ht\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.858376 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79441f7a-11fa-4ace-8deb-29d7db95e67c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.922379 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-65c954fcc-wpwn7"] Dec 01 21:55:45 crc kubenswrapper[4962]: E1201 21:55:45.922823 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api-log" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.922834 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api-log" Dec 01 21:55:45 crc kubenswrapper[4962]: E1201 21:55:45.922880 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.922886 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.923110 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.923146 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" containerName="barbican-api-log" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.930085 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.933096 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.933330 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.933445 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.942092 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79441f7a-11fa-4ace-8deb-29d7db95e67c" (UID: "79441f7a-11fa-4ace-8deb-29d7db95e67c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.960658 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:45 crc kubenswrapper[4962]: I1201 21:55:45.971967 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-65c954fcc-wpwn7"] Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.062430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbmv\" (UniqueName: \"kubernetes.io/projected/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-kube-api-access-rjbmv\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.062804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-etc-swift\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.062857 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-public-tls-certs\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.062906 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-config-data\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.062951 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-log-httpd\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.063003 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-internal-tls-certs\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.063039 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-combined-ca-bundle\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.063069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-run-httpd\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.072817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data" (OuterVolumeSpecName: "config-data") pod "79441f7a-11fa-4ace-8deb-29d7db95e67c" (UID: "79441f7a-11fa-4ace-8deb-29d7db95e67c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.152062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55db4bf6b-mhhwj" event={"ID":"79441f7a-11fa-4ace-8deb-29d7db95e67c","Type":"ContainerDied","Data":"0b7eb0a1abd08acc09f071e88b9c324d0ef37fa750a1c136691a99ae37e0402c"} Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.152116 4962 scope.go:117] "RemoveContainer" containerID="02bbd4368f9bd123ad569cc0d9dc963126eb75ec8073c5bd5ce7b17f8dcca4eb" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.152242 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55db4bf6b-mhhwj" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-etc-swift\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-public-tls-certs\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165301 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-config-data\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-log-httpd\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-internal-tls-certs\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165394 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-combined-ca-bundle\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165417 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-run-httpd\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbmv\" (UniqueName: \"kubernetes.io/projected/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-kube-api-access-rjbmv\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.165605 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79441f7a-11fa-4ace-8deb-29d7db95e67c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.167147 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-log-httpd\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.171235 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-run-httpd\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.172001 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-combined-ca-bundle\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.172060 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-config-data\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.182633 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-internal-tls-certs\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.185747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-public-tls-certs\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.186907 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-etc-swift\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.188056 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbmv\" (UniqueName: \"kubernetes.io/projected/85abfbd6-374e-486e-93f1-8e8c4e8b5da0-kube-api-access-rjbmv\") pod \"swift-proxy-65c954fcc-wpwn7\" (UID: \"85abfbd6-374e-486e-93f1-8e8c4e8b5da0\") " pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.198917 4962 scope.go:117] "RemoveContainer" containerID="56188a8c3bc08d44dc9fad529e37f59ccaac6cc4e0beafc1c2ef0dfe6979ae6a" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.293056 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55db4bf6b-mhhwj"] Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.305736 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-55db4bf6b-mhhwj"] Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.343696 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.830334 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-56f444f67c-lv25m"] Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.837161 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.844981 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.845164 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.845344 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-zjthz" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.852959 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-56f444f67c-lv25m"] Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.983695 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pzp\" (UniqueName: \"kubernetes.io/projected/13829155-474a-445c-b27f-bffdd6b0befb-kube-api-access-d6pzp\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.983847 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.983962 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data-custom\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.984003 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-combined-ca-bundle\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:46 crc kubenswrapper[4962]: I1201 21:55:46.992162 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-ntsmc"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.003124 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.055084 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-ntsmc"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data-custom\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097593 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-combined-ca-bundle\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097625 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097652 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097745 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-config\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097770 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z785b\" (UniqueName: \"kubernetes.io/projected/a52733e0-9924-46f4-aee8-705cda80cc38-kube-api-access-z785b\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.097834 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6pzp\" (UniqueName: \"kubernetes.io/projected/13829155-474a-445c-b27f-bffdd6b0befb-kube-api-access-d6pzp\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.109201 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.109508 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.131870 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.160795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data-custom\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.173977 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-65c954fcc-wpwn7"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.176622 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6pzp\" (UniqueName: \"kubernetes.io/projected/13829155-474a-445c-b27f-bffdd6b0befb-kube-api-access-d6pzp\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.182691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-combined-ca-bundle\") pod \"heat-engine-56f444f67c-lv25m\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.212323 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.212408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.212432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " 
pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.212491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-config\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.212520 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z785b\" (UniqueName: \"kubernetes.io/projected/a52733e0-9924-46f4-aee8-705cda80cc38-kube-api-access-z785b\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.212548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.213482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.214043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.214590 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.216783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-config\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.227420 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.258620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-65c954fcc-wpwn7" event={"ID":"85abfbd6-374e-486e-93f1-8e8c4e8b5da0","Type":"ContainerStarted","Data":"72d3662ee384be4325a965b5af2b3e4fa43df0be74b266b584d86e0afad7b487"} Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.273000 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z785b\" (UniqueName: \"kubernetes.io/projected/a52733e0-9924-46f4-aee8-705cda80cc38-kube-api-access-z785b\") pod \"dnsmasq-dns-688b9f5b49-ntsmc\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.338054 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58f5d66b6f-bgzrf"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.339793 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.346916 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.406984 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6648ff8f4d-xwjnk"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.409625 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.418288 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.426663 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.426716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdhdl\" (UniqueName: \"kubernetes.io/projected/7b64f053-da53-49c7-a227-dcc84b5c078d-kube-api-access-hdhdl\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.426923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-combined-ca-bundle\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.427296 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data-custom\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.434546 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.435353 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58f5d66b6f-bgzrf"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.453709 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6648ff8f4d-xwjnk"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.460785 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.528648 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-combined-ca-bundle\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.528698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data-custom\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.528719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-combined-ca-bundle\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.528777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.528845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data-custom\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.528915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhw7\" (UniqueName: \"kubernetes.io/projected/a75e6047-420d-4aa3-a817-90a547491be2-kube-api-access-kfhw7\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.528999 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.529019 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdhdl\" (UniqueName: \"kubernetes.io/projected/7b64f053-da53-49c7-a227-dcc84b5c078d-kube-api-access-hdhdl\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.537160 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-combined-ca-bundle\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" 
(UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.537400 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data-custom\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.541804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.552884 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdhdl\" (UniqueName: \"kubernetes.io/projected/7b64f053-da53-49c7-a227-dcc84b5c078d-kube-api-access-hdhdl\") pod \"heat-cfnapi-58f5d66b6f-bgzrf\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.614122 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-b85sm"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.615628 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.631607 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhw7\" (UniqueName: \"kubernetes.io/projected/a75e6047-420d-4aa3-a817-90a547491be2-kube-api-access-kfhw7\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.632452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data-custom\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.633352 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-combined-ca-bundle\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.633914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.648670 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 
21:55:47.652510 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-b85sm"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.674721 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhw7\" (UniqueName: \"kubernetes.io/projected/a75e6047-420d-4aa3-a817-90a547491be2-kube-api-access-kfhw7\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.700788 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.704915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data-custom\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.705307 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4zm7d"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.707116 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.720582 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-combined-ca-bundle\") pod \"heat-api-6648ff8f4d-xwjnk\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.726334 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-25db-account-create-update-h9hw7"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.729115 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.731425 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.735790 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.740320 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwgk6\" (UniqueName: \"kubernetes.io/projected/e2e2dc11-c865-45cb-ab81-c39e911fdef9-kube-api-access-mwgk6\") pod \"nova-api-db-create-b85sm\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.740444 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e2dc11-c865-45cb-ab81-c39e911fdef9-operator-scripts\") pod \"nova-api-db-create-b85sm\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.771700 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4zm7d"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.806532 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-25db-account-create-update-h9hw7"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.842647 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297b50-c65c-4dc3-8eed-d86b046b7f84-operator-scripts\") pod \"nova-api-25db-account-create-update-h9hw7\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.842718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwgk6\" (UniqueName: \"kubernetes.io/projected/e2e2dc11-c865-45cb-ab81-c39e911fdef9-kube-api-access-mwgk6\") pod \"nova-api-db-create-b85sm\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.842744 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-operator-scripts\") pod \"nova-cell0-db-create-4zm7d\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.842807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2ww2\" (UniqueName: \"kubernetes.io/projected/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-kube-api-access-t2ww2\") pod \"nova-cell0-db-create-4zm7d\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.842887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpdpl\" (UniqueName: \"kubernetes.io/projected/88297b50-c65c-4dc3-8eed-d86b046b7f84-kube-api-access-lpdpl\") pod \"nova-api-25db-account-create-update-h9hw7\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.842922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e2e2dc11-c865-45cb-ab81-c39e911fdef9-operator-scripts\") pod \"nova-api-db-create-b85sm\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.845489 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e2dc11-c865-45cb-ab81-c39e911fdef9-operator-scripts\") pod \"nova-api-db-create-b85sm\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.906057 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwgk6\" (UniqueName: \"kubernetes.io/projected/e2e2dc11-c865-45cb-ab81-c39e911fdef9-kube-api-access-mwgk6\") pod \"nova-api-db-create-b85sm\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.964084 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tpnk4"] Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.965859 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.971779 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-operator-scripts\") pod \"nova-cell0-db-create-4zm7d\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.971928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2ww2\" (UniqueName: \"kubernetes.io/projected/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-kube-api-access-t2ww2\") pod \"nova-cell0-db-create-4zm7d\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.972234 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdpl\" (UniqueName: \"kubernetes.io/projected/88297b50-c65c-4dc3-8eed-d86b046b7f84-kube-api-access-lpdpl\") pod \"nova-api-25db-account-create-update-h9hw7\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.973589 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-operator-scripts\") pod \"nova-cell0-db-create-4zm7d\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.973926 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297b50-c65c-4dc3-8eed-d86b046b7f84-operator-scripts\") pod \"nova-api-25db-account-create-update-h9hw7\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.977120 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-b85sm" Dec 01 21:55:47 crc kubenswrapper[4962]: I1201 21:55:47.987820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297b50-c65c-4dc3-8eed-d86b046b7f84-operator-scripts\") pod \"nova-api-25db-account-create-update-h9hw7\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.001316 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-287a-account-create-update-svftm"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.003174 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.006507 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.037126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpdpl\" (UniqueName: \"kubernetes.io/projected/88297b50-c65c-4dc3-8eed-d86b046b7f84-kube-api-access-lpdpl\") pod \"nova-api-25db-account-create-update-h9hw7\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.039577 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2ww2\" (UniqueName: \"kubernetes.io/projected/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-kube-api-access-t2ww2\") pod \"nova-cell0-db-create-4zm7d\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.041501 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tpnk4"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.069866 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.070060 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-287a-account-create-update-svftm"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.078349 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f2a608-1cbf-4af5-ae73-9e3d141ae906-operator-scripts\") pod \"nova-cell0-287a-account-create-update-svftm\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.078446 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0f3d631-bb76-48fd-9bb2-2326d9044956-operator-scripts\") pod \"nova-cell1-db-create-tpnk4\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.078484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spb7d\" (UniqueName: \"kubernetes.io/projected/a0f3d631-bb76-48fd-9bb2-2326d9044956-kube-api-access-spb7d\") pod \"nova-cell1-db-create-tpnk4\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.078520 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdbj\" (UniqueName: \"kubernetes.io/projected/62f2a608-1cbf-4af5-ae73-9e3d141ae906-kube-api-access-6wdbj\") pod \"nova-cell0-287a-account-create-update-svftm\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.089417 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.188026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f2a608-1cbf-4af5-ae73-9e3d141ae906-operator-scripts\") pod \"nova-cell0-287a-account-create-update-svftm\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.188522 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0f3d631-bb76-48fd-9bb2-2326d9044956-operator-scripts\") pod \"nova-cell1-db-create-tpnk4\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.189541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spb7d\" (UniqueName: \"kubernetes.io/projected/a0f3d631-bb76-48fd-9bb2-2326d9044956-kube-api-access-spb7d\") pod \"nova-cell1-db-create-tpnk4\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.190371 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdbj\" (UniqueName: \"kubernetes.io/projected/62f2a608-1cbf-4af5-ae73-9e3d141ae906-kube-api-access-6wdbj\") pod \"nova-cell0-287a-account-create-update-svftm\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.190440 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0f3d631-bb76-48fd-9bb2-2326d9044956-operator-scripts\") pod \"nova-cell1-db-create-tpnk4\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.190449 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f2a608-1cbf-4af5-ae73-9e3d141ae906-operator-scripts\") pod \"nova-cell0-287a-account-create-update-svftm\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.216893 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-52d5-account-create-update-qxxxt"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.231022 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.233044 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.239323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdbj\" (UniqueName: \"kubernetes.io/projected/62f2a608-1cbf-4af5-ae73-9e3d141ae906-kube-api-access-6wdbj\") pod \"nova-cell0-287a-account-create-update-svftm\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.273467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spb7d\" (UniqueName: \"kubernetes.io/projected/a0f3d631-bb76-48fd-9bb2-2326d9044956-kube-api-access-spb7d\") pod \"nova-cell1-db-create-tpnk4\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.322980 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79441f7a-11fa-4ace-8deb-29d7db95e67c" path="/var/lib/kubelet/pods/79441f7a-11fa-4ace-8deb-29d7db95e67c/volumes" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.343157 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-52d5-account-create-update-qxxxt"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.404175 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-ntsmc"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.428505 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd00f79-0c0f-4016-bd96-e0c497c73e36-operator-scripts\") pod \"nova-cell1-52d5-account-create-update-qxxxt\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.428764 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cpg5\" (UniqueName: \"kubernetes.io/projected/6dd00f79-0c0f-4016-bd96-e0c497c73e36-kube-api-access-2cpg5\") pod \"nova-cell1-52d5-account-create-update-qxxxt\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.447473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-56f444f67c-lv25m"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.530679 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cpg5\" (UniqueName: \"kubernetes.io/projected/6dd00f79-0c0f-4016-bd96-e0c497c73e36-kube-api-access-2cpg5\") pod \"nova-cell1-52d5-account-create-update-qxxxt\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.530786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd00f79-0c0f-4016-bd96-e0c497c73e36-operator-scripts\") pod \"nova-cell1-52d5-account-create-update-qxxxt\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " 
pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.538731 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd00f79-0c0f-4016-bd96-e0c497c73e36-operator-scripts\") pod \"nova-cell1-52d5-account-create-update-qxxxt\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.561563 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58f5d66b6f-bgzrf"] Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.605741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cpg5\" (UniqueName: \"kubernetes.io/projected/6dd00f79-0c0f-4016-bd96-e0c497c73e36-kube-api-access-2cpg5\") pod \"nova-cell1-52d5-account-create-update-qxxxt\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.713410 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6648ff8f4d-xwjnk"] Dec 01 21:55:48 crc kubenswrapper[4962]: W1201 21:55:48.747472 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6 WatchSource:0}: Error finding container 39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6: Status 404 returned error can't find the container with id 39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6 Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.850716 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.887699 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.896397 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:55:48 crc kubenswrapper[4962]: I1201 21:55:48.912502 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-b85sm"] Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.122684 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4zm7d"] Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.216624 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-25db-account-create-update-h9hw7"] Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.372915 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-56f444f67c-lv25m" event={"ID":"13829155-474a-445c-b27f-bffdd6b0befb","Type":"ContainerStarted","Data":"c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.373230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-56f444f67c-lv25m" event={"ID":"13829155-474a-445c-b27f-bffdd6b0befb","Type":"ContainerStarted","Data":"b3725c76f9647cefa4f3236345172e1005cfd2fb7918b922a6c3f6bfd5ffc701"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.373280 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.375829 4962 generic.go:334] "Generic (PLEG): container finished" podID="a52733e0-9924-46f4-aee8-705cda80cc38" containerID="e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819" exitCode=0 Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.375910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" event={"ID":"a52733e0-9924-46f4-aee8-705cda80cc38","Type":"ContainerDied","Data":"e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.375962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" event={"ID":"a52733e0-9924-46f4-aee8-705cda80cc38","Type":"ContainerStarted","Data":"6ffe45da241b8ae5dd2d68951a590c7a2aa4b0035203a508523f9bdf5fa81136"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.379049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6648ff8f4d-xwjnk" event={"ID":"a75e6047-420d-4aa3-a817-90a547491be2","Type":"ContainerStarted","Data":"39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.386078 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-65c954fcc-wpwn7" event={"ID":"85abfbd6-374e-486e-93f1-8e8c4e8b5da0","Type":"ContainerStarted","Data":"ebbb750b1a23fa33e0e402b14012241e4959ea3ebff4c427f887f629f484f36d"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.386132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-65c954fcc-wpwn7" event={"ID":"85abfbd6-374e-486e-93f1-8e8c4e8b5da0","Type":"ContainerStarted","Data":"a92e601245abe1cd17a4c50ff2085bb640896e6169969ad6645af9016703398d"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.386198 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.386227 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.392609 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" event={"ID":"7b64f053-da53-49c7-a227-dcc84b5c078d","Type":"ContainerStarted","Data":"33224545768efe14e0f9135291c025e5550cae8ca8ffd28d78d6584d918eacdb"} Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.415680 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-56f444f67c-lv25m" podStartSLOduration=3.415664516 podStartE2EDuration="3.415664516s" podCreationTimestamp="2025-12-01 21:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:49.395453221 +0000 UTC m=+1333.496892426" watchObservedRunningTime="2025-12-01 21:55:49.415664516 +0000 UTC m=+1333.517103701" Dec 01 21:55:49 crc kubenswrapper[4962]: I1201 21:55:49.435170 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-65c954fcc-wpwn7" podStartSLOduration=4.435153392 podStartE2EDuration="4.435153392s" podCreationTimestamp="2025-12-01 21:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:49.42876356 +0000 UTC m=+1333.530202765" watchObservedRunningTime="2025-12-01 21:55:49.435153392 +0000 UTC m=+1333.536592587" Dec 01 21:55:50 crc kubenswrapper[4962]: I1201 21:55:50.954518 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:50 crc kubenswrapper[4962]: I1201 21:55:50.955099 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-central-agent" containerID="cri-o://88820d2455788335cb70872e685f36cce70095113a6ab68695d19976c7052d48" gracePeriod=30 Dec 01 21:55:50 crc kubenswrapper[4962]: I1201 21:55:50.958988 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="sg-core" containerID="cri-o://dbdae4b355d4072d74e9f0cf004663def7872c74591d4a907f201b028ae0c649" gracePeriod=30 Dec 01 21:55:50 crc kubenswrapper[4962]: I1201 21:55:50.959114 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="proxy-httpd" containerID="cri-o://83854fb61d90bc731ffa1b9c20134fab91b244e3e3f98020bc76170b355d69b1" gracePeriod=30 Dec 01 21:55:50 crc kubenswrapper[4962]: I1201 21:55:50.959167 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-notification-agent" containerID="cri-o://5aa07407749c8b852d88b0d747067ab051c1258e0cff82c7cb0c70d57542b327" gracePeriod=30 Dec 01 21:55:50 crc kubenswrapper[4962]: I1201 21:55:50.974425 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.204:3000/\": read tcp 10.217.0.2:58428->10.217.0.204:3000: read: connection reset by peer" Dec 01 21:55:51 crc kubenswrapper[4962]: I1201 21:55:51.426948 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerID="83854fb61d90bc731ffa1b9c20134fab91b244e3e3f98020bc76170b355d69b1" exitCode=0 Dec 01 21:55:51 crc kubenswrapper[4962]: I1201 21:55:51.426978 4962 generic.go:334] "Generic (PLEG): container finished" podID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerID="dbdae4b355d4072d74e9f0cf004663def7872c74591d4a907f201b028ae0c649" exitCode=2 Dec 01 21:55:51 crc kubenswrapper[4962]: I1201 21:55:51.426987 4962 generic.go:334] "Generic (PLEG): container finished" podID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerID="88820d2455788335cb70872e685f36cce70095113a6ab68695d19976c7052d48" exitCode=0 Dec 01 21:55:51 crc kubenswrapper[4962]: I1201 21:55:51.427005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerDied","Data":"83854fb61d90bc731ffa1b9c20134fab91b244e3e3f98020bc76170b355d69b1"} Dec 01 21:55:51 crc kubenswrapper[4962]: I1201 21:55:51.427030 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerDied","Data":"dbdae4b355d4072d74e9f0cf004663def7872c74591d4a907f201b028ae0c649"} Dec 01 21:55:51 crc kubenswrapper[4962]: I1201 21:55:51.427040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerDied","Data":"88820d2455788335cb70872e685f36cce70095113a6ab68695d19976c7052d48"} Dec 01 21:55:53 crc kubenswrapper[4962]: I1201 21:55:53.465721 4962 generic.go:334] "Generic (PLEG): container finished" podID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerID="5aa07407749c8b852d88b0d747067ab051c1258e0cff82c7cb0c70d57542b327" exitCode=0 Dec 01 21:55:53 crc kubenswrapper[4962]: I1201 21:55:53.466299 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerDied","Data":"5aa07407749c8b852d88b0d747067ab051c1258e0cff82c7cb0c70d57542b327"} Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.005648 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6776d74cd9-xhqgt"] Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.007441 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.085580 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6776d74cd9-xhqgt"] Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.116580 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data-custom\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.116663 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-combined-ca-bundle\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.117372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lntx7\" (UniqueName: \"kubernetes.io/projected/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-kube-api-access-lntx7\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.117605 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.126530 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6555856bc4-7xjr6"] Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.128342 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.135446 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6555856bc4-7xjr6"] Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.149090 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7747d4b4f4-gldgq"] Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.150834 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.157089 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7747d4b4f4-gldgq"] Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219426 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data-custom\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219470 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqw2\" (UniqueName: \"kubernetes.io/projected/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-kube-api-access-kkqw2\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219537 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lntx7\" (UniqueName: \"kubernetes.io/projected/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-kube-api-access-lntx7\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data-custom\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219662 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219678 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data-custom\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-combined-ca-bundle\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219725 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-combined-ca-bundle\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219751 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq5m4\" (UniqueName: \"kubernetes.io/projected/aada9c17-9992-466a-bd0c-212a580295fa-kube-api-access-zq5m4\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219773 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.219817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-combined-ca-bundle\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.228225 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data-custom\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.231438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-combined-ca-bundle\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.238926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.242028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lntx7\" (UniqueName: \"kubernetes.io/projected/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-kube-api-access-lntx7\") pod \"heat-engine-6776d74cd9-xhqgt\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321606 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data-custom\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321646 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kkqw2\" (UniqueName: \"kubernetes.io/projected/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-kube-api-access-kkqw2\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321766 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321785 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data-custom\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321801 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-combined-ca-bundle\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq5m4\" (UniqueName: \"kubernetes.io/projected/aada9c17-9992-466a-bd0c-212a580295fa-kube-api-access-zq5m4\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.321915 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-combined-ca-bundle\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.332437 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-combined-ca-bundle\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.333311 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.337159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data-custom\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.341639 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.342350 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqw2\" (UniqueName: \"kubernetes.io/projected/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-kube-api-access-kkqw2\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.347416 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data-custom\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.349648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-combined-ca-bundle\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.358326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data\") pod \"heat-api-7747d4b4f4-gldgq\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.359175 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq5m4\" (UniqueName: \"kubernetes.io/projected/aada9c17-9992-466a-bd0c-212a580295fa-kube-api-access-zq5m4\") pod \"heat-cfnapi-6555856bc4-7xjr6\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.449832 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:55:55 crc kubenswrapper[4962]: I1201 21:55:55.470160 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.353912 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.363947 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-65c954fcc-wpwn7" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.565812 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6648ff8f4d-xwjnk"] Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.584982 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58f5d66b6f-bgzrf"] Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.612738 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-59c6c9c84d-2tqrs"] Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.615573 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.620028 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.620381 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.650139 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5ff8b998b6-kt4sg"] Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.654076 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.657675 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.657968 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.659576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.659700 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data-custom\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.659786 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-combined-ca-bundle\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.659891 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data-custom\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660003 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-internal-tls-certs\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqwb\" (UniqueName: \"kubernetes.io/projected/63a2deaf-718f-418e-bed6-b8b1351c4d85-kube-api-access-5jqwb\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-public-tls-certs\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660375 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-combined-ca-bundle\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-public-tls-certs\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5296\" (UniqueName: \"kubernetes.io/projected/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-kube-api-access-s5296\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.660646 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-internal-tls-certs\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.675531 
4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5ff8b998b6-kt4sg"] Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.721242 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59c6c9c84d-2tqrs"] Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762169 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-combined-ca-bundle\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-public-tls-certs\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762254 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5296\" (UniqueName: \"kubernetes.io/projected/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-kube-api-access-s5296\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-internal-tls-certs\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762344 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762362 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data-custom\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762403 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-combined-ca-bundle\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762449 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data-custom\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-internal-tls-certs\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762555 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqwb\" (UniqueName: \"kubernetes.io/projected/63a2deaf-718f-418e-bed6-b8b1351c4d85-kube-api-access-5jqwb\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.762572 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-public-tls-certs\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.768926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data-custom\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.771746 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data-custom\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.779394 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-combined-ca-bundle\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.779481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-internal-tls-certs\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.779918 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-combined-ca-bundle\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.780250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-internal-tls-certs\") 
pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.782078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-public-tls-certs\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.782380 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-public-tls-certs\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.783883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5296\" (UniqueName: \"kubernetes.io/projected/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-kube-api-access-s5296\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.800792 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqwb\" (UniqueName: \"kubernetes.io/projected/63a2deaf-718f-418e-bed6-b8b1351c4d85-kube-api-access-5jqwb\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.801070 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data\") pod \"heat-cfnapi-5ff8b998b6-kt4sg\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:56 crc kubenswrapper[4962]: I1201 21:55:56.808873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data\") pod \"heat-api-59c6c9c84d-2tqrs\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:57 crc kubenswrapper[4962]: I1201 21:55:57.001835 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:55:57 crc kubenswrapper[4962]: I1201 21:55:57.002532 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:55:57 crc kubenswrapper[4962]: I1201 21:55:57.531391 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b85sm" event={"ID":"e2e2dc11-c865-45cb-ab81-c39e911fdef9","Type":"ContainerStarted","Data":"d67859305aa34f36de08ed19ded340191e9f814a788ba40318519d10b9cbbf27"} Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.566695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4zm7d" event={"ID":"bdaa775e-7b2f-4c56-8f07-256a62b4ed20","Type":"ContainerStarted","Data":"0f7d79bba7c08d0383759d10a6afede4325c1f20e87f581bf090b94ef924503a"} Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.573871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25db-account-create-update-h9hw7" event={"ID":"88297b50-c65c-4dc3-8eed-d86b046b7f84","Type":"ContainerStarted","Data":"ea987da1782f04319ce1db2ec333b6784d426f3249387ed9fc3624465d8acb13"} Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.712633 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-287a-account-create-update-svftm"] Dec 01 21:55:58 crc kubenswrapper[4962]: W1201 21:55:58.735284 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62f2a608_1cbf_4af5_ae73_9e3d141ae906.slice/crio-9a703de34b9be02a144bc4c5cd12ab3f4d99529bee72b89441787c5f5993d84c WatchSource:0}: Error finding container 9a703de34b9be02a144bc4c5cd12ab3f4d99529bee72b89441787c5f5993d84c: Status 404 returned error can't find the container with id 9a703de34b9be02a144bc4c5cd12ab3f4d99529bee72b89441787c5f5993d84c Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.769851 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.925422 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-combined-ca-bundle\") pod \"2d6db005-fb47-40ad-979c-04ccd8146c41\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.925765 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-log-httpd\") pod \"2d6db005-fb47-40ad-979c-04ccd8146c41\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.925795 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-scripts\") pod \"2d6db005-fb47-40ad-979c-04ccd8146c41\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.925847 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-sg-core-conf-yaml\") pod \"2d6db005-fb47-40ad-979c-04ccd8146c41\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.925923 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4skn\" (UniqueName: \"kubernetes.io/projected/2d6db005-fb47-40ad-979c-04ccd8146c41-kube-api-access-r4skn\") pod \"2d6db005-fb47-40ad-979c-04ccd8146c41\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.925964 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-run-httpd\") pod \"2d6db005-fb47-40ad-979c-04ccd8146c41\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.926082 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-config-data\") pod \"2d6db005-fb47-40ad-979c-04ccd8146c41\" (UID: \"2d6db005-fb47-40ad-979c-04ccd8146c41\") " Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.927605 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2d6db005-fb47-40ad-979c-04ccd8146c41" (UID: "2d6db005-fb47-40ad-979c-04ccd8146c41"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.932222 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2d6db005-fb47-40ad-979c-04ccd8146c41" (UID: "2d6db005-fb47-40ad-979c-04ccd8146c41"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.944965 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-scripts" (OuterVolumeSpecName: "scripts") pod "2d6db005-fb47-40ad-979c-04ccd8146c41" (UID: "2d6db005-fb47-40ad-979c-04ccd8146c41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.945802 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6db005-fb47-40ad-979c-04ccd8146c41-kube-api-access-r4skn" (OuterVolumeSpecName: "kube-api-access-r4skn") pod "2d6db005-fb47-40ad-979c-04ccd8146c41" (UID: "2d6db005-fb47-40ad-979c-04ccd8146c41"). InnerVolumeSpecName "kube-api-access-r4skn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:55:58 crc kubenswrapper[4962]: I1201 21:55:58.983775 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2d6db005-fb47-40ad-979c-04ccd8146c41" (UID: "2d6db005-fb47-40ad-979c-04ccd8146c41"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.029069 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.029097 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4skn\" (UniqueName: \"kubernetes.io/projected/2d6db005-fb47-40ad-979c-04ccd8146c41-kube-api-access-r4skn\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.029108 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.029116 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d6db005-fb47-40ad-979c-04ccd8146c41-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.029127 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.087098 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d6db005-fb47-40ad-979c-04ccd8146c41" (UID: "2d6db005-fb47-40ad-979c-04ccd8146c41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.124774 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-config-data" (OuterVolumeSpecName: "config-data") pod "2d6db005-fb47-40ad-979c-04ccd8146c41" (UID: "2d6db005-fb47-40ad-979c-04ccd8146c41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.131289 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.131317 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6db005-fb47-40ad-979c-04ccd8146c41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.165581 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5ff8b998b6-kt4sg"] Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.197432 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7747d4b4f4-gldgq"] Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.215432 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6776d74cd9-xhqgt"] Dec 01 21:55:59 crc kubenswrapper[4962]: W1201 21:55:59.224456 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0564a7bc_694c_4c6b_b7ec_7e7d26f4ea38.slice/crio-4f980f4cc5d6bc6dda7468888f04db385a7f544d3d736e7d3d4d51585ae47bff WatchSource:0}: Error finding container 4f980f4cc5d6bc6dda7468888f04db385a7f544d3d736e7d3d4d51585ae47bff: Status 404 returned error can't find the container with id 4f980f4cc5d6bc6dda7468888f04db385a7f544d3d736e7d3d4d51585ae47bff Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.227756 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59c6c9c84d-2tqrs"] Dec 01 21:55:59 crc kubenswrapper[4962]: W1201 21:55:59.238903 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63a2deaf_718f_418e_bed6_b8b1351c4d85.slice/crio-ef3d7aaf3a2f6c9b95329b88992d39bdf7e33f165f3266c38fb916023d4e7406 WatchSource:0}: Error finding container ef3d7aaf3a2f6c9b95329b88992d39bdf7e33f165f3266c38fb916023d4e7406: Status 404 returned error can't find the container with id ef3d7aaf3a2f6c9b95329b88992d39bdf7e33f165f3266c38fb916023d4e7406 Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.245033 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6555856bc4-7xjr6"] Dec 01 21:55:59 crc kubenswrapper[4962]: W1201 21:55:59.278438 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a5ca37_2aa2_43a5_8e9d_a58d62955f44.slice/crio-93f32796cf6b2df6c8a4208ca786c163fe23f0ee5788edd5bce43df0ac416c04 WatchSource:0}: Error finding container 93f32796cf6b2df6c8a4208ca786c163fe23f0ee5788edd5bce43df0ac416c04: Status 404 returned error can't find the container with id 93f32796cf6b2df6c8a4208ca786c163fe23f0ee5788edd5bce43df0ac416c04 Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.291239 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-52d5-account-create-update-qxxxt"] Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.301361 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tpnk4"] Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.592520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" 
event={"ID":"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38","Type":"ContainerStarted","Data":"4f980f4cc5d6bc6dda7468888f04db385a7f544d3d736e7d3d4d51585ae47bff"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.594209 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9fd1f254-7f23-46a6-b2fd-986de362e028","Type":"ContainerStarted","Data":"9e5fb636fdd916616b8e342a9026b25847a2a37ebbbafe714267fee1e993c7a3"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.604339 4962 generic.go:334] "Generic (PLEG): container finished" podID="bdaa775e-7b2f-4c56-8f07-256a62b4ed20" containerID="25f007fbb2c33ecfdc5a981921a1439c076bfb642912dbffdbffa0965553bcdd" exitCode=0 Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.604476 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4zm7d" event={"ID":"bdaa775e-7b2f-4c56-8f07-256a62b4ed20","Type":"ContainerDied","Data":"25f007fbb2c33ecfdc5a981921a1439c076bfb642912dbffdbffa0965553bcdd"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.607088 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25db-account-create-update-h9hw7" event={"ID":"88297b50-c65c-4dc3-8eed-d86b046b7f84","Type":"ContainerStarted","Data":"0933a1b3fae9725aafaf5fa8117f6c9c36071772289fc14389d1f89c99a0bf99"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.615237 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d6db005-fb47-40ad-979c-04ccd8146c41","Type":"ContainerDied","Data":"3978d68b47d582dd594ff3c60e1c8e5c491e7d7e7bf9857d41542924c852d3bc"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.615306 4962 scope.go:117] "RemoveContainer" containerID="83854fb61d90bc731ffa1b9c20134fab91b244e3e3f98020bc76170b355d69b1" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.615603 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.626682 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tpnk4" event={"ID":"a0f3d631-bb76-48fd-9bb2-2326d9044956","Type":"ContainerStarted","Data":"6734d4d7d5d7a7e3a4b3fc1fcf5b534ca37454bc57c2d58023114345fa69751a"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.626730 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tpnk4" event={"ID":"a0f3d631-bb76-48fd-9bb2-2326d9044956","Type":"ContainerStarted","Data":"ef58c93f3ad841f4e324f50966937d49ea54ac2be6bce1ce07916bf9522b060b"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.632745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" event={"ID":"aada9c17-9992-466a-bd0c-212a580295fa","Type":"ContainerStarted","Data":"c4018b3f7df8b793587679258df9c79a685f18eb7bb2fc90329ad29fb4b0a9a5"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.636066 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.853241241 podStartE2EDuration="21.636046691s" podCreationTimestamp="2025-12-01 21:55:38 +0000 UTC" firstStartedPulling="2025-12-01 21:55:39.956911592 +0000 UTC m=+1324.058350777" lastFinishedPulling="2025-12-01 21:55:57.739717032 +0000 UTC m=+1341.841156227" observedRunningTime="2025-12-01 21:55:59.613034796 +0000 UTC m=+1343.714474001" watchObservedRunningTime="2025-12-01 21:55:59.636046691 +0000 UTC m=+1343.737485886" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.638616 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" event={"ID":"6dd00f79-0c0f-4016-bd96-e0c497c73e36","Type":"ContainerStarted","Data":"b0b3f6b5e43f218f3080cddeb8d91e9f5d13534744de82e0d531d2d2519cbc6e"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.644109 4962 generic.go:334] "Generic (PLEG): container finished" podID="e2e2dc11-c865-45cb-ab81-c39e911fdef9" containerID="d4e9f414a0b7139454956f76686a5451feb7b9d41abfad2effe2882278195e7b" exitCode=0 Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.644194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b85sm" event={"ID":"e2e2dc11-c865-45cb-ab81-c39e911fdef9","Type":"ContainerDied","Data":"d4e9f414a0b7139454956f76686a5451feb7b9d41abfad2effe2882278195e7b"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.648698 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-25db-account-create-update-h9hw7" podStartSLOduration=12.648677571 podStartE2EDuration="12.648677571s" podCreationTimestamp="2025-12-01 21:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:59.634733854 +0000 UTC m=+1343.736173049" watchObservedRunningTime="2025-12-01 21:55:59.648677571 +0000 UTC m=+1343.750116776" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.654762 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c6c9c84d-2tqrs" event={"ID":"63a2deaf-718f-418e-bed6-b8b1351c4d85","Type":"ContainerStarted","Data":"ef3d7aaf3a2f6c9b95329b88992d39bdf7e33f165f3266c38fb916023d4e7406"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.661459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-engine-6776d74cd9-xhqgt" event={"ID":"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44","Type":"ContainerStarted","Data":"93f32796cf6b2df6c8a4208ca786c163fe23f0ee5788edd5bce43df0ac416c04"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.663866 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-287a-account-create-update-svftm" event={"ID":"62f2a608-1cbf-4af5-ae73-9e3d141ae906","Type":"ContainerStarted","Data":"112394d5797b70c9941755c33740353ce3336dac30cfb421d5359cb72a83bbe6"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.663899 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-287a-account-create-update-svftm" event={"ID":"62f2a608-1cbf-4af5-ae73-9e3d141ae906","Type":"ContainerStarted","Data":"9a703de34b9be02a144bc4c5cd12ab3f4d99529bee72b89441787c5f5993d84c"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.667055 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" event={"ID":"a52733e0-9924-46f4-aee8-705cda80cc38","Type":"ContainerStarted","Data":"0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.667732 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.675392 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-tpnk4" podStartSLOduration=12.675372672 podStartE2EDuration="12.675372672s" podCreationTimestamp="2025-12-01 21:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:59.668055773 +0000 UTC m=+1343.769494968" watchObservedRunningTime="2025-12-01 21:55:59.675372672 +0000 UTC m=+1343.776811867" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.677087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7747d4b4f4-gldgq" event={"ID":"2cd79bc6-3c65-476e-ab58-1dcbe4533e23","Type":"ContainerStarted","Data":"662d6c83177069830f2da0e846d899c3e53c0963c52e24e038a3172bfcb1563d"} Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.720624 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.744998 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.745192 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" podStartSLOduration=13.745174531 podStartE2EDuration="13.745174531s" podCreationTimestamp="2025-12-01 21:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:59.727752474 +0000 UTC m=+1343.829191699" watchObservedRunningTime="2025-12-01 21:55:59.745174531 +0000 UTC m=+1343.846613736" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.822463 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:59 crc kubenswrapper[4962]: E1201 21:55:59.823015 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-central-agent" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823031 4962 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-central-agent" Dec 01 21:55:59 crc kubenswrapper[4962]: E1201 21:55:59.823049 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="proxy-httpd" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823058 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="proxy-httpd" Dec 01 21:55:59 crc kubenswrapper[4962]: E1201 21:55:59.823070 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="sg-core" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823076 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="sg-core" Dec 01 21:55:59 crc kubenswrapper[4962]: E1201 21:55:59.823087 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-notification-agent" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823093 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-notification-agent" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823333 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-central-agent" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823353 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="proxy-httpd" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823376 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="ceilometer-notification-agent" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.823387 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" containerName="sg-core" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.825476 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.832408 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.832815 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.836320 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-287a-account-create-update-svftm" podStartSLOduration=12.836298207 podStartE2EDuration="12.836298207s" podCreationTimestamp="2025-12-01 21:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:55:59.750163653 +0000 UTC m=+1343.851602848" watchObservedRunningTime="2025-12-01 21:55:59.836298207 +0000 UTC m=+1343.937737392" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.851899 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.952699 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74jl4\" (UniqueName: \"kubernetes.io/projected/63129893-531f-4f40-a0d3-e75925071d5a-kube-api-access-74jl4\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.952887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.953057 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-run-httpd\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.953108 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.953487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-scripts\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.953717 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-config-data\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:55:59 crc kubenswrapper[4962]: I1201 21:55:59.953838 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.055968 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-config-data\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.056034 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.056076 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74jl4\" (UniqueName: \"kubernetes.io/projected/63129893-531f-4f40-a0d3-e75925071d5a-kube-api-access-74jl4\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.056130 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.056165 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-run-httpd\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.056186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.056269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-scripts\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.056800 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-run-httpd\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.057054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.061549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.062626 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-scripts\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.062696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-config-data\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.075076 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.080485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74jl4\" (UniqueName: \"kubernetes.io/projected/63129893-531f-4f40-a0d3-e75925071d5a-kube-api-access-74jl4\") pod \"ceilometer-0\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.226461 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.232232 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6db005-fb47-40ad-979c-04ccd8146c41" path="/var/lib/kubelet/pods/2d6db005-fb47-40ad-979c-04ccd8146c41/volumes" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.621122 4962 scope.go:117] "RemoveContainer" containerID="dbdae4b355d4072d74e9f0cf004663def7872c74591d4a907f201b028ae0c649" Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.712125 4962 generic.go:334] "Generic (PLEG): container finished" podID="a0f3d631-bb76-48fd-9bb2-2326d9044956" containerID="6734d4d7d5d7a7e3a4b3fc1fcf5b534ca37454bc57c2d58023114345fa69751a" exitCode=0 Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.712907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tpnk4" event={"ID":"a0f3d631-bb76-48fd-9bb2-2326d9044956","Type":"ContainerDied","Data":"6734d4d7d5d7a7e3a4b3fc1fcf5b534ca37454bc57c2d58023114345fa69751a"} Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.726160 4962 generic.go:334] "Generic (PLEG): container finished" podID="62f2a608-1cbf-4af5-ae73-9e3d141ae906" containerID="112394d5797b70c9941755c33740353ce3336dac30cfb421d5359cb72a83bbe6" exitCode=0 Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.726319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-287a-account-create-update-svftm" event={"ID":"62f2a608-1cbf-4af5-ae73-9e3d141ae906","Type":"ContainerDied","Data":"112394d5797b70c9941755c33740353ce3336dac30cfb421d5359cb72a83bbe6"} Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.733589 4962 generic.go:334] "Generic (PLEG): container finished" podID="88297b50-c65c-4dc3-8eed-d86b046b7f84" 
containerID="0933a1b3fae9725aafaf5fa8117f6c9c36071772289fc14389d1f89c99a0bf99" exitCode=0 Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.733840 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25db-account-create-update-h9hw7" event={"ID":"88297b50-c65c-4dc3-8eed-d86b046b7f84","Type":"ContainerDied","Data":"0933a1b3fae9725aafaf5fa8117f6c9c36071772289fc14389d1f89c99a0bf99"} Dec 01 21:56:00 crc kubenswrapper[4962]: I1201 21:56:00.804108 4962 scope.go:117] "RemoveContainer" containerID="5aa07407749c8b852d88b0d747067ab051c1258e0cff82c7cb0c70d57542b327" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.277387 4962 scope.go:117] "RemoveContainer" containerID="88820d2455788335cb70872e685f36cce70095113a6ab68695d19976c7052d48" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.428306 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.513416 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:01 crc kubenswrapper[4962]: W1201 21:56:01.518886 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63129893_531f_4f40_a0d3_e75925071d5a.slice/crio-771ddf655129a84f999eaac00db4972569c58fb4d809c335e8ff22e4c6c8889b WatchSource:0}: Error finding container 771ddf655129a84f999eaac00db4972569c58fb4d809c335e8ff22e4c6c8889b: Status 404 returned error can't find the container with id 771ddf655129a84f999eaac00db4972569c58fb4d809c335e8ff22e4c6c8889b Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.521996 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b85sm" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.608390 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2ww2\" (UniqueName: \"kubernetes.io/projected/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-kube-api-access-t2ww2\") pod \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.608553 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-operator-scripts\") pod \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\" (UID: \"bdaa775e-7b2f-4c56-8f07-256a62b4ed20\") " Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.609015 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bdaa775e-7b2f-4c56-8f07-256a62b4ed20" (UID: "bdaa775e-7b2f-4c56-8f07-256a62b4ed20"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.609708 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.624565 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-kube-api-access-t2ww2" (OuterVolumeSpecName: "kube-api-access-t2ww2") pod "bdaa775e-7b2f-4c56-8f07-256a62b4ed20" (UID: "bdaa775e-7b2f-4c56-8f07-256a62b4ed20"). InnerVolumeSpecName "kube-api-access-t2ww2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.711500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e2dc11-c865-45cb-ab81-c39e911fdef9-operator-scripts\") pod \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.711807 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwgk6\" (UniqueName: \"kubernetes.io/projected/e2e2dc11-c865-45cb-ab81-c39e911fdef9-kube-api-access-mwgk6\") pod \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\" (UID: \"e2e2dc11-c865-45cb-ab81-c39e911fdef9\") " Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.711855 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e2dc11-c865-45cb-ab81-c39e911fdef9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2e2dc11-c865-45cb-ab81-c39e911fdef9" (UID: "e2e2dc11-c865-45cb-ab81-c39e911fdef9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.712532 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2e2dc11-c865-45cb-ab81-c39e911fdef9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.712553 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2ww2\" (UniqueName: \"kubernetes.io/projected/bdaa775e-7b2f-4c56-8f07-256a62b4ed20-kube-api-access-t2ww2\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.717201 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e2dc11-c865-45cb-ab81-c39e911fdef9-kube-api-access-mwgk6" (OuterVolumeSpecName: "kube-api-access-mwgk6") pod "e2e2dc11-c865-45cb-ab81-c39e911fdef9" (UID: "e2e2dc11-c865-45cb-ab81-c39e911fdef9"). InnerVolumeSpecName "kube-api-access-mwgk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.747110 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6776d74cd9-xhqgt" event={"ID":"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44","Type":"ContainerStarted","Data":"c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.747287 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.749127 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4zm7d" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.749133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4zm7d" event={"ID":"bdaa775e-7b2f-4c56-8f07-256a62b4ed20","Type":"ContainerDied","Data":"0f7d79bba7c08d0383759d10a6afede4325c1f20e87f581bf090b94ef924503a"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.749260 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7d79bba7c08d0383759d10a6afede4325c1f20e87f581bf090b94ef924503a" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.751610 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" event={"ID":"aada9c17-9992-466a-bd0c-212a580295fa","Type":"ContainerStarted","Data":"a55795d93b236b18d1f6c5856aa83175c62d31c7623a3e7c69a163852923807c"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.751747 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.752978 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" event={"ID":"7b64f053-da53-49c7-a227-dcc84b5c078d","Type":"ContainerStarted","Data":"c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.753097 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" podUID="7b64f053-da53-49c7-a227-dcc84b5c078d" containerName="heat-cfnapi" containerID="cri-o://c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916" gracePeriod=60 Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.753359 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.754867 4962 generic.go:334] "Generic (PLEG): container finished" podID="6dd00f79-0c0f-4016-bd96-e0c497c73e36" containerID="312953e90d66804e0c3fe2c6b7f8cb61941ccdfa613a5441030df9579c545190" exitCode=0 Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.754912 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" event={"ID":"6dd00f79-0c0f-4016-bd96-e0c497c73e36","Type":"ContainerDied","Data":"312953e90d66804e0c3fe2c6b7f8cb61941ccdfa613a5441030df9579c545190"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.758377 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b85sm" event={"ID":"e2e2dc11-c865-45cb-ab81-c39e911fdef9","Type":"ContainerDied","Data":"d67859305aa34f36de08ed19ded340191e9f814a788ba40318519d10b9cbbf27"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 
21:56:01.758826 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67859305aa34f36de08ed19ded340191e9f814a788ba40318519d10b9cbbf27" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.758421 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b85sm" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.760084 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c6c9c84d-2tqrs" event={"ID":"63a2deaf-718f-418e-bed6-b8b1351c4d85","Type":"ContainerStarted","Data":"16727c7a62203f9348fd8459a6a6f8c6ef71d3401d4e4feefc46901162f5a1b9"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.760593 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.764426 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7747d4b4f4-gldgq" event={"ID":"2cd79bc6-3c65-476e-ab58-1dcbe4533e23","Type":"ContainerStarted","Data":"c8104227d9f43b422cee01bcbfc551cb4a14092a8a0358af364d23e663ef8597"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.764578 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.766131 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6648ff8f4d-xwjnk" event={"ID":"a75e6047-420d-4aa3-a817-90a547491be2","Type":"ContainerStarted","Data":"78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.766173 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.766198 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6648ff8f4d-xwjnk" podUID="a75e6047-420d-4aa3-a817-90a547491be2" containerName="heat-api" containerID="cri-o://78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b" gracePeriod=60 Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.777959 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6776d74cd9-xhqgt" podStartSLOduration=7.777918717 podStartE2EDuration="7.777918717s" podCreationTimestamp="2025-12-01 21:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:56:01.77238417 +0000 UTC m=+1345.873823365" watchObservedRunningTime="2025-12-01 21:56:01.777918717 +0000 UTC m=+1345.879357922" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.779064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerStarted","Data":"771ddf655129a84f999eaac00db4972569c58fb4d809c335e8ff22e4c6c8889b"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.781678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" event={"ID":"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38","Type":"ContainerStarted","Data":"8d4f4153ae312232890de01e4d5af8bb9b987f42a8438310ce6e51adfdee1ee7"} Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.781873 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:56:01 crc 
kubenswrapper[4962]: I1201 21:56:01.815464 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwgk6\" (UniqueName: \"kubernetes.io/projected/e2e2dc11-c865-45cb-ab81-c39e911fdef9-kube-api-access-mwgk6\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.821901 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" podStartSLOduration=5.251916828 podStartE2EDuration="6.821876739s" podCreationTimestamp="2025-12-01 21:55:55 +0000 UTC" firstStartedPulling="2025-12-01 21:55:59.270571428 +0000 UTC m=+1343.372010623" lastFinishedPulling="2025-12-01 21:56:00.840531339 +0000 UTC m=+1344.941970534" observedRunningTime="2025-12-01 21:56:01.806230143 +0000 UTC m=+1345.907669338" watchObservedRunningTime="2025-12-01 21:56:01.821876739 +0000 UTC m=+1345.923315934" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.867699 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-59c6c9c84d-2tqrs" podStartSLOduration=4.284778953 podStartE2EDuration="5.867673823s" podCreationTimestamp="2025-12-01 21:55:56 +0000 UTC" firstStartedPulling="2025-12-01 21:55:59.270881447 +0000 UTC m=+1343.372320642" lastFinishedPulling="2025-12-01 21:56:00.853776317 +0000 UTC m=+1344.955215512" observedRunningTime="2025-12-01 21:56:01.824584436 +0000 UTC m=+1345.926023641" watchObservedRunningTime="2025-12-01 21:56:01.867673823 +0000 UTC m=+1345.969113018" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.872005 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7747d4b4f4-gldgq" podStartSLOduration=5.260693368 podStartE2EDuration="6.871978996s" podCreationTimestamp="2025-12-01 21:55:55 +0000 UTC" firstStartedPulling="2025-12-01 21:55:59.239000509 +0000 UTC m=+1343.340439694" lastFinishedPulling="2025-12-01 21:56:00.850286137 +0000 UTC m=+1344.951725322" observedRunningTime="2025-12-01 21:56:01.848408095 +0000 UTC m=+1345.949847290" watchObservedRunningTime="2025-12-01 21:56:01.871978996 +0000 UTC m=+1345.973418201" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.909266 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6648ff8f4d-xwjnk" podStartSLOduration=2.80996259 podStartE2EDuration="14.909239548s" podCreationTimestamp="2025-12-01 21:55:47 +0000 UTC" firstStartedPulling="2025-12-01 21:55:48.751473722 +0000 UTC m=+1332.852912917" lastFinishedPulling="2025-12-01 21:56:00.85075068 +0000 UTC m=+1344.952189875" observedRunningTime="2025-12-01 21:56:01.888368443 +0000 UTC m=+1345.989807638" watchObservedRunningTime="2025-12-01 21:56:01.909239548 +0000 UTC m=+1346.010678763" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.925368 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" podStartSLOduration=3.634027397 podStartE2EDuration="15.925343547s" podCreationTimestamp="2025-12-01 21:55:46 +0000 UTC" firstStartedPulling="2025-12-01 21:55:48.528310354 +0000 UTC m=+1332.629749549" lastFinishedPulling="2025-12-01 21:56:00.819626504 +0000 UTC m=+1344.921065699" observedRunningTime="2025-12-01 21:56:01.911630436 +0000 UTC m=+1346.013069651" watchObservedRunningTime="2025-12-01 21:56:01.925343547 +0000 UTC m=+1346.026782762" Dec 01 21:56:01 crc kubenswrapper[4962]: I1201 21:56:01.968397 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" podStartSLOduration=4.342776767 podStartE2EDuration="5.968374783s" podCreationTimestamp="2025-12-01 21:55:56 +0000 UTC" firstStartedPulling="2025-12-01 21:55:59.233922904 +0000 UTC m=+1343.335362099" lastFinishedPulling="2025-12-01 21:56:00.85952092 +0000 UTC m=+1344.960960115" observedRunningTime="2025-12-01 21:56:01.928682372 +0000 UTC m=+1346.030121567" watchObservedRunningTime="2025-12-01 21:56:01.968374783 +0000 UTC m=+1346.069813978" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.466959 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.550197 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spb7d\" (UniqueName: \"kubernetes.io/projected/a0f3d631-bb76-48fd-9bb2-2326d9044956-kube-api-access-spb7d\") pod \"a0f3d631-bb76-48fd-9bb2-2326d9044956\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.550286 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0f3d631-bb76-48fd-9bb2-2326d9044956-operator-scripts\") pod \"a0f3d631-bb76-48fd-9bb2-2326d9044956\" (UID: \"a0f3d631-bb76-48fd-9bb2-2326d9044956\") " Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.553158 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f3d631-bb76-48fd-9bb2-2326d9044956-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0f3d631-bb76-48fd-9bb2-2326d9044956" (UID: "a0f3d631-bb76-48fd-9bb2-2326d9044956"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.559680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f3d631-bb76-48fd-9bb2-2326d9044956-kube-api-access-spb7d" (OuterVolumeSpecName: "kube-api-access-spb7d") pod "a0f3d631-bb76-48fd-9bb2-2326d9044956" (UID: "a0f3d631-bb76-48fd-9bb2-2326d9044956"). InnerVolumeSpecName "kube-api-access-spb7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.617048 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.670122 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spb7d\" (UniqueName: \"kubernetes.io/projected/a0f3d631-bb76-48fd-9bb2-2326d9044956-kube-api-access-spb7d\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.670164 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0f3d631-bb76-48fd-9bb2-2326d9044956-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.788491 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297b50-c65c-4dc3-8eed-d86b046b7f84-operator-scripts\") pod \"88297b50-c65c-4dc3-8eed-d86b046b7f84\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.788672 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpdpl\" (UniqueName: \"kubernetes.io/projected/88297b50-c65c-4dc3-8eed-d86b046b7f84-kube-api-access-lpdpl\") pod \"88297b50-c65c-4dc3-8eed-d86b046b7f84\" (UID: \"88297b50-c65c-4dc3-8eed-d86b046b7f84\") " Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.788946 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88297b50-c65c-4dc3-8eed-d86b046b7f84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88297b50-c65c-4dc3-8eed-d86b046b7f84" (UID: "88297b50-c65c-4dc3-8eed-d86b046b7f84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.789443 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297b50-c65c-4dc3-8eed-d86b046b7f84-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.794758 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88297b50-c65c-4dc3-8eed-d86b046b7f84-kube-api-access-lpdpl" (OuterVolumeSpecName: "kube-api-access-lpdpl") pod "88297b50-c65c-4dc3-8eed-d86b046b7f84" (UID: "88297b50-c65c-4dc3-8eed-d86b046b7f84"). InnerVolumeSpecName "kube-api-access-lpdpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.824762 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerStarted","Data":"3d860a04a627da29702c51df3bac4b8ed669baaafa6d7e8510b5d920da590e75"} Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.828731 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tpnk4" event={"ID":"a0f3d631-bb76-48fd-9bb2-2326d9044956","Type":"ContainerDied","Data":"ef58c93f3ad841f4e324f50966937d49ea54ac2be6bce1ce07916bf9522b060b"} Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.828768 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tpnk4" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.828778 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef58c93f3ad841f4e324f50966937d49ea54ac2be6bce1ce07916bf9522b060b" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.831448 4962 generic.go:334] "Generic (PLEG): container finished" podID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerID="c8104227d9f43b422cee01bcbfc551cb4a14092a8a0358af364d23e663ef8597" exitCode=1 Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.831677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7747d4b4f4-gldgq" event={"ID":"2cd79bc6-3c65-476e-ab58-1dcbe4533e23","Type":"ContainerDied","Data":"c8104227d9f43b422cee01bcbfc551cb4a14092a8a0358af364d23e663ef8597"} Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.834030 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25db-account-create-update-h9hw7" event={"ID":"88297b50-c65c-4dc3-8eed-d86b046b7f84","Type":"ContainerDied","Data":"ea987da1782f04319ce1db2ec333b6784d426f3249387ed9fc3624465d8acb13"} Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.834093 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea987da1782f04319ce1db2ec333b6784d426f3249387ed9fc3624465d8acb13" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.838670 4962 generic.go:334] "Generic (PLEG): container finished" podID="aada9c17-9992-466a-bd0c-212a580295fa" containerID="a55795d93b236b18d1f6c5856aa83175c62d31c7623a3e7c69a163852923807c" exitCode=1 Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.840488 4962 scope.go:117] "RemoveContainer" containerID="a55795d93b236b18d1f6c5856aa83175c62d31c7623a3e7c69a163852923807c" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.840820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" event={"ID":"aada9c17-9992-466a-bd0c-212a580295fa","Type":"ContainerDied","Data":"a55795d93b236b18d1f6c5856aa83175c62d31c7623a3e7c69a163852923807c"} Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.833984 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25db-account-create-update-h9hw7" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.841356 4962 scope.go:117] "RemoveContainer" containerID="c8104227d9f43b422cee01bcbfc551cb4a14092a8a0358af364d23e663ef8597" Dec 01 21:56:02 crc kubenswrapper[4962]: I1201 21:56:02.893981 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpdpl\" (UniqueName: \"kubernetes.io/projected/88297b50-c65c-4dc3-8eed-d86b046b7f84-kube-api-access-lpdpl\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.174749 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.322268 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.327985 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f2a608-1cbf-4af5-ae73-9e3d141ae906-operator-scripts\") pod \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.328445 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wdbj\" (UniqueName: \"kubernetes.io/projected/62f2a608-1cbf-4af5-ae73-9e3d141ae906-kube-api-access-6wdbj\") pod \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\" (UID: \"62f2a608-1cbf-4af5-ae73-9e3d141ae906\") " Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.329882 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2a608-1cbf-4af5-ae73-9e3d141ae906-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62f2a608-1cbf-4af5-ae73-9e3d141ae906" (UID: "62f2a608-1cbf-4af5-ae73-9e3d141ae906"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.331594 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f2a608-1cbf-4af5-ae73-9e3d141ae906-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.340151 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f2a608-1cbf-4af5-ae73-9e3d141ae906-kube-api-access-6wdbj" (OuterVolumeSpecName: "kube-api-access-6wdbj") pod "62f2a608-1cbf-4af5-ae73-9e3d141ae906" (UID: "62f2a608-1cbf-4af5-ae73-9e3d141ae906"). InnerVolumeSpecName "kube-api-access-6wdbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.433656 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cpg5\" (UniqueName: \"kubernetes.io/projected/6dd00f79-0c0f-4016-bd96-e0c497c73e36-kube-api-access-2cpg5\") pod \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.433856 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd00f79-0c0f-4016-bd96-e0c497c73e36-operator-scripts\") pod \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\" (UID: \"6dd00f79-0c0f-4016-bd96-e0c497c73e36\") " Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.434431 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd00f79-0c0f-4016-bd96-e0c497c73e36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6dd00f79-0c0f-4016-bd96-e0c497c73e36" (UID: "6dd00f79-0c0f-4016-bd96-e0c497c73e36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.437480 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd00f79-0c0f-4016-bd96-e0c497c73e36-kube-api-access-2cpg5" (OuterVolumeSpecName: "kube-api-access-2cpg5") pod "6dd00f79-0c0f-4016-bd96-e0c497c73e36" (UID: "6dd00f79-0c0f-4016-bd96-e0c497c73e36"). 
InnerVolumeSpecName "kube-api-access-2cpg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.439513 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wdbj\" (UniqueName: \"kubernetes.io/projected/62f2a608-1cbf-4af5-ae73-9e3d141ae906-kube-api-access-6wdbj\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.439544 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd00f79-0c0f-4016-bd96-e0c497c73e36-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.546061 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cpg5\" (UniqueName: \"kubernetes.io/projected/6dd00f79-0c0f-4016-bd96-e0c497c73e36-kube-api-access-2cpg5\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.849781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" event={"ID":"6dd00f79-0c0f-4016-bd96-e0c497c73e36","Type":"ContainerDied","Data":"b0b3f6b5e43f218f3080cddeb8d91e9f5d13534744de82e0d531d2d2519cbc6e"} Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.850047 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b3f6b5e43f218f3080cddeb8d91e9f5d13534744de82e0d531d2d2519cbc6e" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.850200 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-52d5-account-create-update-qxxxt" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.860435 4962 generic.go:334] "Generic (PLEG): container finished" podID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerID="9f947b5766c5ef2440fe5917a26945ad9af65bf5738caa53f1a36cd6a5ee872b" exitCode=1 Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.860509 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7747d4b4f4-gldgq" event={"ID":"2cd79bc6-3c65-476e-ab58-1dcbe4533e23","Type":"ContainerDied","Data":"9f947b5766c5ef2440fe5917a26945ad9af65bf5738caa53f1a36cd6a5ee872b"} Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.860541 4962 scope.go:117] "RemoveContainer" containerID="c8104227d9f43b422cee01bcbfc551cb4a14092a8a0358af364d23e663ef8597" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.861208 4962 scope.go:117] "RemoveContainer" containerID="9f947b5766c5ef2440fe5917a26945ad9af65bf5738caa53f1a36cd6a5ee872b" Dec 01 21:56:03 crc kubenswrapper[4962]: E1201 21:56:03.861501 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7747d4b4f4-gldgq_openstack(2cd79bc6-3c65-476e-ab58-1dcbe4533e23)\"" pod="openstack/heat-api-7747d4b4f4-gldgq" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.862317 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-287a-account-create-update-svftm" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.862318 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-287a-account-create-update-svftm" event={"ID":"62f2a608-1cbf-4af5-ae73-9e3d141ae906","Type":"ContainerDied","Data":"9a703de34b9be02a144bc4c5cd12ab3f4d99529bee72b89441787c5f5993d84c"} Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.862403 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a703de34b9be02a144bc4c5cd12ab3f4d99529bee72b89441787c5f5993d84c" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.867221 4962 generic.go:334] "Generic (PLEG): container finished" podID="aada9c17-9992-466a-bd0c-212a580295fa" containerID="c5c8c8383fdff4c08abf12f41ff53dde7a7c4e4a2aa25cd366a22097b1f67158" exitCode=1 Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.867301 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" event={"ID":"aada9c17-9992-466a-bd0c-212a580295fa","Type":"ContainerDied","Data":"c5c8c8383fdff4c08abf12f41ff53dde7a7c4e4a2aa25cd366a22097b1f67158"} Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.868172 4962 scope.go:117] "RemoveContainer" containerID="c5c8c8383fdff4c08abf12f41ff53dde7a7c4e4a2aa25cd366a22097b1f67158" Dec 01 21:56:03 crc kubenswrapper[4962]: E1201 21:56:03.868559 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6555856bc4-7xjr6_openstack(aada9c17-9992-466a-bd0c-212a580295fa)\"" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" podUID="aada9c17-9992-466a-bd0c-212a580295fa" Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.881149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerStarted","Data":"1cb0ce72822a9102f9c560213bfabf47a25660e86129bc6b1d4f699789fac5b3"} Dec 01 21:56:03 crc kubenswrapper[4962]: I1201 21:56:03.945200 4962 scope.go:117] "RemoveContainer" containerID="a55795d93b236b18d1f6c5856aa83175c62d31c7623a3e7c69a163852923807c" Dec 01 21:56:04 crc kubenswrapper[4962]: I1201 21:56:04.986457 4962 scope.go:117] "RemoveContainer" containerID="9f947b5766c5ef2440fe5917a26945ad9af65bf5738caa53f1a36cd6a5ee872b" Dec 01 21:56:04 crc kubenswrapper[4962]: E1201 21:56:04.987687 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7747d4b4f4-gldgq_openstack(2cd79bc6-3c65-476e-ab58-1dcbe4533e23)\"" pod="openstack/heat-api-7747d4b4f4-gldgq" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" Dec 01 21:56:05 crc kubenswrapper[4962]: I1201 21:56:05.011170 4962 scope.go:117] "RemoveContainer" containerID="c5c8c8383fdff4c08abf12f41ff53dde7a7c4e4a2aa25cd366a22097b1f67158" Dec 01 21:56:05 crc kubenswrapper[4962]: E1201 21:56:05.011464 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6555856bc4-7xjr6_openstack(aada9c17-9992-466a-bd0c-212a580295fa)\"" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" podUID="aada9c17-9992-466a-bd0c-212a580295fa" Dec 01 21:56:05 crc kubenswrapper[4962]: I1201 21:56:05.028440 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerStarted","Data":"0684d0be3ff1743fbb032b8205e4e8970f933b764c051469b84b40fcf639d349"} Dec 01 21:56:05 crc kubenswrapper[4962]: I1201 21:56:05.450649 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:56:05 crc kubenswrapper[4962]: I1201 21:56:05.451020 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:56:05 crc kubenswrapper[4962]: I1201 21:56:05.471033 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:56:05 crc kubenswrapper[4962]: I1201 21:56:05.471077 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:56:06 crc kubenswrapper[4962]: I1201 21:56:06.041460 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerStarted","Data":"d4980274227a5cb7b9a30cc4629fb2d6aba2347cc030d0a131c2571ccd3b66a1"} Dec 01 21:56:06 crc kubenswrapper[4962]: I1201 21:56:06.042122 4962 scope.go:117] "RemoveContainer" containerID="9f947b5766c5ef2440fe5917a26945ad9af65bf5738caa53f1a36cd6a5ee872b" Dec 01 21:56:06 crc kubenswrapper[4962]: E1201 21:56:06.042360 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7747d4b4f4-gldgq_openstack(2cd79bc6-3c65-476e-ab58-1dcbe4533e23)\"" pod="openstack/heat-api-7747d4b4f4-gldgq" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" Dec 01 21:56:06 crc kubenswrapper[4962]: I1201 21:56:06.044302 4962 scope.go:117] "RemoveContainer" containerID="c5c8c8383fdff4c08abf12f41ff53dde7a7c4e4a2aa25cd366a22097b1f67158" Dec 01 21:56:06 crc kubenswrapper[4962]: E1201 21:56:06.044782 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6555856bc4-7xjr6_openstack(aada9c17-9992-466a-bd0c-212a580295fa)\"" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" podUID="aada9c17-9992-466a-bd0c-212a580295fa" Dec 01 21:56:06 crc kubenswrapper[4962]: I1201 21:56:06.072682 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.889105224 podStartE2EDuration="7.07266234s" podCreationTimestamp="2025-12-01 21:55:59 +0000 UTC" firstStartedPulling="2025-12-01 21:56:01.523704714 +0000 UTC m=+1345.625143919" lastFinishedPulling="2025-12-01 21:56:05.70726184 +0000 UTC m=+1349.808701035" observedRunningTime="2025-12-01 21:56:06.062198902 +0000 UTC m=+1350.163638097" watchObservedRunningTime="2025-12-01 21:56:06.07266234 +0000 UTC m=+1350.174101535" Dec 01 21:56:06 crc kubenswrapper[4962]: I1201 21:56:06.364662 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:07 crc kubenswrapper[4962]: I1201 21:56:07.051596 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:56:07 crc kubenswrapper[4962]: I1201 21:56:07.436123 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 
Dec 01 21:56:07 crc kubenswrapper[4962]: I1201 21:56:07.516265 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-42wgd"]
Dec 01 21:56:07 crc kubenswrapper[4962]: I1201 21:56:07.516544 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" podUID="5f1e9610-2753-4001-a9fb-5e020774725b" containerName="dnsmasq-dns" containerID="cri-o://77fa35faed3d880a1a9dee57b9ba09a7430aca2181f1a19e5dd261235253db5d" gracePeriod=10
Dec 01 21:56:07 crc kubenswrapper[4962]: I1201 21:56:07.529241 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-56f444f67c-lv25m"
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.072249 4962 generic.go:334] "Generic (PLEG): container finished" podID="5f1e9610-2753-4001-a9fb-5e020774725b" containerID="77fa35faed3d880a1a9dee57b9ba09a7430aca2181f1a19e5dd261235253db5d" exitCode=0
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.072747 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-central-agent" containerID="cri-o://3d860a04a627da29702c51df3bac4b8ed669baaafa6d7e8510b5d920da590e75" gracePeriod=30
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.073059 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="proxy-httpd" containerID="cri-o://d4980274227a5cb7b9a30cc4629fb2d6aba2347cc030d0a131c2571ccd3b66a1" gracePeriod=30
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.073052 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" event={"ID":"5f1e9610-2753-4001-a9fb-5e020774725b","Type":"ContainerDied","Data":"77fa35faed3d880a1a9dee57b9ba09a7430aca2181f1a19e5dd261235253db5d"}
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.073090 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-notification-agent" containerID="cri-o://1cb0ce72822a9102f9c560213bfabf47a25660e86129bc6b1d4f699789fac5b3" gracePeriod=30
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.073142 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="sg-core" containerID="cri-o://0684d0be3ff1743fbb032b8205e4e8970f933b764c051469b84b40fcf639d349" gracePeriod=30
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.333413 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4997b"]
Dec 01 21:56:08 crc kubenswrapper[4962]: E1201 21:56:08.340267 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdaa775e-7b2f-4c56-8f07-256a62b4ed20" containerName="mariadb-database-create"
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340301 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaa775e-7b2f-4c56-8f07-256a62b4ed20" containerName="mariadb-database-create"
Dec 01 21:56:08 crc kubenswrapper[4962]: E1201 21:56:08.340335 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2a608-1cbf-4af5-ae73-9e3d141ae906" containerName="mariadb-account-create-update"
Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340341 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2a608-1cbf-4af5-ae73-9e3d141ae906" containerName="mariadb-account-create-update"
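
The "Killing container with a grace period" records show the two termination budgets in play: 10s for the dnsmasq-dns container and 30s for each ceilometer-0 container. Kubelet sends SIGTERM and only escalates to SIGKILL when the grace period expires; dnsmasq's exitCode=0, observed by PLEG about half a second after the kill was issued, indicates it shut down cleanly well inside its budget:

    from datetime import datetime

    # Time from "Killing container with a grace period" to the PLEG
    # "container finished" observation for dnsmasq-dns (timestamps above).
    killed = datetime.strptime("21:56:07.516544", "%H:%M:%S.%f")
    finished = datetime.strptime("21:56:08.072249", "%H:%M:%S.%f")
    print((finished - killed).total_seconds())  # 0.555705 s, well under gracePeriod=10
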
"Deleted CPUSet assignment" podUID="62f2a608-1cbf-4af5-ae73-9e3d141ae906" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: E1201 21:56:08.340372 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd00f79-0c0f-4016-bd96-e0c497c73e36" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340378 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd00f79-0c0f-4016-bd96-e0c497c73e36" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: E1201 21:56:08.340387 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e2dc11-c865-45cb-ab81-c39e911fdef9" containerName="mariadb-database-create" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340393 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e2dc11-c865-45cb-ab81-c39e911fdef9" containerName="mariadb-database-create" Dec 01 21:56:08 crc kubenswrapper[4962]: E1201 21:56:08.340405 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f3d631-bb76-48fd-9bb2-2326d9044956" containerName="mariadb-database-create" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340412 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f3d631-bb76-48fd-9bb2-2326d9044956" containerName="mariadb-database-create" Dec 01 21:56:08 crc kubenswrapper[4962]: E1201 21:56:08.340426 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88297b50-c65c-4dc3-8eed-d86b046b7f84" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340432 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="88297b50-c65c-4dc3-8eed-d86b046b7f84" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340736 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f3d631-bb76-48fd-9bb2-2326d9044956" containerName="mariadb-database-create" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340748 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e2dc11-c865-45cb-ab81-c39e911fdef9" containerName="mariadb-database-create" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340768 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd00f79-0c0f-4016-bd96-e0c497c73e36" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340778 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f2a608-1cbf-4af5-ae73-9e3d141ae906" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340785 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="88297b50-c65c-4dc3-8eed-d86b046b7f84" containerName="mariadb-account-create-update" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.340795 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdaa775e-7b2f-4c56-8f07-256a62b4ed20" containerName="mariadb-database-create" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.341579 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.345215 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.345430 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.349784 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4997b"] Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.351894 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vtxzn" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.404376 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.530801 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4thwd\" (UniqueName: \"kubernetes.io/projected/5f1e9610-2753-4001-a9fb-5e020774725b-kube-api-access-4thwd\") pod \"5f1e9610-2753-4001-a9fb-5e020774725b\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.530883 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-svc\") pod \"5f1e9610-2753-4001-a9fb-5e020774725b\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.531129 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-nb\") pod \"5f1e9610-2753-4001-a9fb-5e020774725b\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.531188 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-sb\") pod \"5f1e9610-2753-4001-a9fb-5e020774725b\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.531217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-config\") pod \"5f1e9610-2753-4001-a9fb-5e020774725b\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.531258 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-swift-storage-0\") pod \"5f1e9610-2753-4001-a9fb-5e020774725b\" (UID: \"5f1e9610-2753-4001-a9fb-5e020774725b\") " Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.531671 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-config-data\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.531740 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd4nr\" (UniqueName: \"kubernetes.io/projected/93391e58-9905-4fec-a4fe-4ed30bbb5eec-kube-api-access-jd4nr\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.531897 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-scripts\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.532064 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.567898 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1e9610-2753-4001-a9fb-5e020774725b-kube-api-access-4thwd" (OuterVolumeSpecName: "kube-api-access-4thwd") pod "5f1e9610-2753-4001-a9fb-5e020774725b" (UID: "5f1e9610-2753-4001-a9fb-5e020774725b"). InnerVolumeSpecName "kube-api-access-4thwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.626391 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-config" (OuterVolumeSpecName: "config") pod "5f1e9610-2753-4001-a9fb-5e020774725b" (UID: "5f1e9610-2753-4001-a9fb-5e020774725b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.631353 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5f1e9610-2753-4001-a9fb-5e020774725b" (UID: "5f1e9610-2753-4001-a9fb-5e020774725b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.633899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd4nr\" (UniqueName: \"kubernetes.io/projected/93391e58-9905-4fec-a4fe-4ed30bbb5eec-kube-api-access-jd4nr\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.634044 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-scripts\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.634095 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.634149 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-config-data\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.634211 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4thwd\" (UniqueName: \"kubernetes.io/projected/5f1e9610-2753-4001-a9fb-5e020774725b-kube-api-access-4thwd\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.634225 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.634235 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.642257 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.642274 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-scripts\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.650645 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-config-data\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: 
\"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.654291 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd4nr\" (UniqueName: \"kubernetes.io/projected/93391e58-9905-4fec-a4fe-4ed30bbb5eec-kube-api-access-jd4nr\") pod \"nova-cell0-conductor-db-sync-4997b\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.654374 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f1e9610-2753-4001-a9fb-5e020774725b" (UID: "5f1e9610-2753-4001-a9fb-5e020774725b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.658324 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5f1e9610-2753-4001-a9fb-5e020774725b" (UID: "5f1e9610-2753-4001-a9fb-5e020774725b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.672631 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5f1e9610-2753-4001-a9fb-5e020774725b" (UID: "5f1e9610-2753-4001-a9fb-5e020774725b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.736757 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.736787 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.736799 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f1e9610-2753-4001-a9fb-5e020774725b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.747498 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.932541 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 21:56:08 crc kubenswrapper[4962]: I1201 21:56:08.977302 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.048591 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7747d4b4f4-gldgq"] Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.143232 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6555856bc4-7xjr6"] Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201386 4962 generic.go:334] "Generic (PLEG): container finished" podID="63129893-531f-4f40-a0d3-e75925071d5a" containerID="d4980274227a5cb7b9a30cc4629fb2d6aba2347cc030d0a131c2571ccd3b66a1" exitCode=0 Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201416 4962 generic.go:334] "Generic (PLEG): container finished" podID="63129893-531f-4f40-a0d3-e75925071d5a" containerID="0684d0be3ff1743fbb032b8205e4e8970f933b764c051469b84b40fcf639d349" exitCode=2 Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201425 4962 generic.go:334] "Generic (PLEG): container finished" podID="63129893-531f-4f40-a0d3-e75925071d5a" containerID="1cb0ce72822a9102f9c560213bfabf47a25660e86129bc6b1d4f699789fac5b3" exitCode=0 Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201433 4962 generic.go:334] "Generic (PLEG): container finished" podID="63129893-531f-4f40-a0d3-e75925071d5a" containerID="3d860a04a627da29702c51df3bac4b8ed669baaafa6d7e8510b5d920da590e75" exitCode=0 Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201505 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerDied","Data":"d4980274227a5cb7b9a30cc4629fb2d6aba2347cc030d0a131c2571ccd3b66a1"} Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerDied","Data":"0684d0be3ff1743fbb032b8205e4e8970f933b764c051469b84b40fcf639d349"} Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201549 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerDied","Data":"1cb0ce72822a9102f9c560213bfabf47a25660e86129bc6b1d4f699789fac5b3"} Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.201557 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerDied","Data":"3d860a04a627da29702c51df3bac4b8ed669baaafa6d7e8510b5d920da590e75"} Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.215019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" event={"ID":"5f1e9610-2753-4001-a9fb-5e020774725b","Type":"ContainerDied","Data":"1f92e3fa8a465f375e2752c223d3eb3067817fabd451aa5612d522f9389ec307"} Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.215591 4962 scope.go:117] "RemoveContainer" containerID="77fa35faed3d880a1a9dee57b9ba09a7430aca2181f1a19e5dd261235253db5d" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.215353 4962 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.265114 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-42wgd"] Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.265762 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.274755 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-42wgd"] Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.285505 4962 scope.go:117] "RemoveContainer" containerID="40bde0aa50d937698515a8077f29b99419a0c41d9992444b34270646a3a67cdf" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.358900 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-scripts\") pod \"63129893-531f-4f40-a0d3-e75925071d5a\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.359077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-sg-core-conf-yaml\") pod \"63129893-531f-4f40-a0d3-e75925071d5a\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.359101 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74jl4\" (UniqueName: \"kubernetes.io/projected/63129893-531f-4f40-a0d3-e75925071d5a-kube-api-access-74jl4\") pod \"63129893-531f-4f40-a0d3-e75925071d5a\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.359126 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-log-httpd\") pod \"63129893-531f-4f40-a0d3-e75925071d5a\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.359159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-combined-ca-bundle\") pod \"63129893-531f-4f40-a0d3-e75925071d5a\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.359295 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-run-httpd\") pod \"63129893-531f-4f40-a0d3-e75925071d5a\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.359372 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-config-data\") pod \"63129893-531f-4f40-a0d3-e75925071d5a\" (UID: \"63129893-531f-4f40-a0d3-e75925071d5a\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.365472 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "63129893-531f-4f40-a0d3-e75925071d5a" (UID: 
"63129893-531f-4f40-a0d3-e75925071d5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.365656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "63129893-531f-4f40-a0d3-e75925071d5a" (UID: "63129893-531f-4f40-a0d3-e75925071d5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.372369 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-scripts" (OuterVolumeSpecName: "scripts") pod "63129893-531f-4f40-a0d3-e75925071d5a" (UID: "63129893-531f-4f40-a0d3-e75925071d5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.389113 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63129893-531f-4f40-a0d3-e75925071d5a-kube-api-access-74jl4" (OuterVolumeSpecName: "kube-api-access-74jl4") pod "63129893-531f-4f40-a0d3-e75925071d5a" (UID: "63129893-531f-4f40-a0d3-e75925071d5a"). InnerVolumeSpecName "kube-api-access-74jl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.421162 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "63129893-531f-4f40-a0d3-e75925071d5a" (UID: "63129893-531f-4f40-a0d3-e75925071d5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.463065 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63129893-531f-4f40-a0d3-e75925071d5a" (UID: "63129893-531f-4f40-a0d3-e75925071d5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.470173 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.470205 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.470214 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.470223 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.470232 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74jl4\" (UniqueName: \"kubernetes.io/projected/63129893-531f-4f40-a0d3-e75925071d5a-kube-api-access-74jl4\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.470243 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63129893-531f-4f40-a0d3-e75925071d5a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.543517 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.586838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-config-data" (OuterVolumeSpecName: "config-data") pod "63129893-531f-4f40-a0d3-e75925071d5a" (UID: "63129893-531f-4f40-a0d3-e75925071d5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.590825 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.675217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data-custom\") pod \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.675587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data\") pod \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.675628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkqw2\" (UniqueName: \"kubernetes.io/projected/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-kube-api-access-kkqw2\") pod \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.675785 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-combined-ca-bundle\") pod \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\" (UID: \"2cd79bc6-3c65-476e-ab58-1dcbe4533e23\") " Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.676414 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63129893-531f-4f40-a0d3-e75925071d5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.679946 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2cd79bc6-3c65-476e-ab58-1dcbe4533e23" (UID: "2cd79bc6-3c65-476e-ab58-1dcbe4533e23"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.683279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-kube-api-access-kkqw2" (OuterVolumeSpecName: "kube-api-access-kkqw2") pod "2cd79bc6-3c65-476e-ab58-1dcbe4533e23" (UID: "2cd79bc6-3c65-476e-ab58-1dcbe4533e23"). InnerVolumeSpecName "kube-api-access-kkqw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.713535 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd79bc6-3c65-476e-ab58-1dcbe4533e23" (UID: "2cd79bc6-3c65-476e-ab58-1dcbe4533e23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.770194 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data" (OuterVolumeSpecName: "config-data") pod "2cd79bc6-3c65-476e-ab58-1dcbe4533e23" (UID: "2cd79bc6-3c65-476e-ab58-1dcbe4533e23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.786753 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.786795 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkqw2\" (UniqueName: \"kubernetes.io/projected/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-kube-api-access-kkqw2\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.786806 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.786816 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cd79bc6-3c65-476e-ab58-1dcbe4533e23-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.870120 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4997b"] Dec 01 21:56:09 crc kubenswrapper[4962]: I1201 21:56:09.942293 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.003103 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.102468 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data\") pod \"aada9c17-9992-466a-bd0c-212a580295fa\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.102627 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-combined-ca-bundle\") pod \"aada9c17-9992-466a-bd0c-212a580295fa\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.102736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data-custom\") pod \"aada9c17-9992-466a-bd0c-212a580295fa\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.102867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq5m4\" (UniqueName: \"kubernetes.io/projected/aada9c17-9992-466a-bd0c-212a580295fa-kube-api-access-zq5m4\") pod \"aada9c17-9992-466a-bd0c-212a580295fa\" (UID: \"aada9c17-9992-466a-bd0c-212a580295fa\") " Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.106643 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aada9c17-9992-466a-bd0c-212a580295fa" (UID: "aada9c17-9992-466a-bd0c-212a580295fa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.107001 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aada9c17-9992-466a-bd0c-212a580295fa-kube-api-access-zq5m4" (OuterVolumeSpecName: "kube-api-access-zq5m4") pod "aada9c17-9992-466a-bd0c-212a580295fa" (UID: "aada9c17-9992-466a-bd0c-212a580295fa"). InnerVolumeSpecName "kube-api-access-zq5m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.158633 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aada9c17-9992-466a-bd0c-212a580295fa" (UID: "aada9c17-9992-466a-bd0c-212a580295fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.175631 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data" (OuterVolumeSpecName: "config-data") pod "aada9c17-9992-466a-bd0c-212a580295fa" (UID: "aada9c17-9992-466a-bd0c-212a580295fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.205531 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.205569 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.205581 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aada9c17-9992-466a-bd0c-212a580295fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.205590 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq5m4\" (UniqueName: \"kubernetes.io/projected/aada9c17-9992-466a-bd0c-212a580295fa-kube-api-access-zq5m4\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.228071 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7747d4b4f4-gldgq" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.229745 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.230273 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1e9610-2753-4001-a9fb-5e020774725b" path="/var/lib/kubelet/pods/5f1e9610-2753-4001-a9fb-5e020774725b/volumes" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.231327 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4997b" event={"ID":"93391e58-9905-4fec-a4fe-4ed30bbb5eec","Type":"ContainerStarted","Data":"b181454367b6a79571acb2a769b6881754138bee8f6d5d109eea996485a78b02"} Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.231351 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7747d4b4f4-gldgq" event={"ID":"2cd79bc6-3c65-476e-ab58-1dcbe4533e23","Type":"ContainerDied","Data":"662d6c83177069830f2da0e846d899c3e53c0963c52e24e038a3172bfcb1563d"} Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.231362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6555856bc4-7xjr6" event={"ID":"aada9c17-9992-466a-bd0c-212a580295fa","Type":"ContainerDied","Data":"c4018b3f7df8b793587679258df9c79a685f18eb7bb2fc90329ad29fb4b0a9a5"} Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.231381 4962 scope.go:117] "RemoveContainer" containerID="9f947b5766c5ef2440fe5917a26945ad9af65bf5738caa53f1a36cd6a5ee872b" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.232389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63129893-531f-4f40-a0d3-e75925071d5a","Type":"ContainerDied","Data":"771ddf655129a84f999eaac00db4972569c58fb4d809c335e8ff22e4c6c8889b"} Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.232492 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.270706 4962 scope.go:117] "RemoveContainer" containerID="c5c8c8383fdff4c08abf12f41ff53dde7a7c4e4a2aa25cd366a22097b1f67158" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.311385 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7747d4b4f4-gldgq"] Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.313783 4962 scope.go:117] "RemoveContainer" containerID="d4980274227a5cb7b9a30cc4629fb2d6aba2347cc030d0a131c2571ccd3b66a1" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.339989 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7747d4b4f4-gldgq"] Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.344082 4962 scope.go:117] "RemoveContainer" containerID="0684d0be3ff1743fbb032b8205e4e8970f933b764c051469b84b40fcf639d349" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.378694 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.380045 4962 scope.go:117] "RemoveContainer" containerID="1cb0ce72822a9102f9c560213bfabf47a25660e86129bc6b1d4f699789fac5b3" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.435006 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.451108 4962 scope.go:117] "RemoveContainer" containerID="3d860a04a627da29702c51df3bac4b8ed669baaafa6d7e8510b5d920da590e75" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.463992 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6555856bc4-7xjr6"] Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.484043 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6555856bc4-7xjr6"] Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503024 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503492 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="sg-core" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503503 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="sg-core" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503518 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="proxy-httpd" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503524 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="proxy-httpd" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503536 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aada9c17-9992-466a-bd0c-212a580295fa" containerName="heat-cfnapi" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503542 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="aada9c17-9992-466a-bd0c-212a580295fa" containerName="heat-cfnapi" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503559 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1e9610-2753-4001-a9fb-5e020774725b" containerName="dnsmasq-dns" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503567 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5f1e9610-2753-4001-a9fb-5e020774725b" containerName="dnsmasq-dns" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503576 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerName="heat-api" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503582 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerName="heat-api" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503595 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aada9c17-9992-466a-bd0c-212a580295fa" containerName="heat-cfnapi" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503600 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="aada9c17-9992-466a-bd0c-212a580295fa" containerName="heat-cfnapi" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503614 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerName="heat-api" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503619 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerName="heat-api" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503636 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-central-agent" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503642 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-central-agent" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503657 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1e9610-2753-4001-a9fb-5e020774725b" containerName="init" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503662 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1e9610-2753-4001-a9fb-5e020774725b" containerName="init" Dec 01 21:56:10 crc kubenswrapper[4962]: E1201 21:56:10.503675 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-notification-agent" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503681 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-notification-agent" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503903 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerName="heat-api" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503912 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="proxy-httpd" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503927 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1e9610-2753-4001-a9fb-5e020774725b" containerName="dnsmasq-dns" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503956 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-notification-agent" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503968 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="aada9c17-9992-466a-bd0c-212a580295fa" containerName="heat-cfnapi" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503983 4962 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="ceilometer-central-agent" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.503992 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="63129893-531f-4f40-a0d3-e75925071d5a" containerName="sg-core" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.504380 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" containerName="heat-api" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.504399 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="aada9c17-9992-466a-bd0c-212a580295fa" containerName="heat-cfnapi" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.510903 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.519991 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.520230 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.532425 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.624221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.624284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-run-httpd\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.624323 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.624350 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-config-data\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.624402 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-scripts\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.624419 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mvc8\" (UniqueName: \"kubernetes.io/projected/272645a0-8a27-4b3e-8f4f-5e456f118d84-kube-api-access-4mvc8\") pod \"ceilometer-0\" (UID: 
\"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.624502 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-log-httpd\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.726584 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-log-httpd\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.726677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.726711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-run-httpd\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.726740 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.726763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-config-data\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.726812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-scripts\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.726855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvc8\" (UniqueName: \"kubernetes.io/projected/272645a0-8a27-4b3e-8f4f-5e456f118d84-kube-api-access-4mvc8\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.727429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-run-httpd\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.728205 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-log-httpd\") pod \"ceilometer-0\" (UID: 
\"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.733367 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-scripts\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.733810 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.736216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-config-data\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.739849 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.746291 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvc8\" (UniqueName: \"kubernetes.io/projected/272645a0-8a27-4b3e-8f4f-5e456f118d84-kube-api-access-4mvc8\") pod \"ceilometer-0\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " pod="openstack/ceilometer-0" Dec 01 21:56:10 crc kubenswrapper[4962]: I1201 21:56:10.871357 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:11 crc kubenswrapper[4962]: I1201 21:56:11.397876 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:12 crc kubenswrapper[4962]: I1201 21:56:12.237222 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd79bc6-3c65-476e-ab58-1dcbe4533e23" path="/var/lib/kubelet/pods/2cd79bc6-3c65-476e-ab58-1dcbe4533e23/volumes" Dec 01 21:56:12 crc kubenswrapper[4962]: I1201 21:56:12.238833 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63129893-531f-4f40-a0d3-e75925071d5a" path="/var/lib/kubelet/pods/63129893-531f-4f40-a0d3-e75925071d5a/volumes" Dec 01 21:56:12 crc kubenswrapper[4962]: I1201 21:56:12.239676 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aada9c17-9992-466a-bd0c-212a580295fa" path="/var/lib/kubelet/pods/aada9c17-9992-466a-bd0c-212a580295fa/volumes" Dec 01 21:56:12 crc kubenswrapper[4962]: I1201 21:56:12.291690 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerStarted","Data":"2552f2dca1b12dabd1917284e7c2a47006998b9f9527e8f0b6250123f888cc95"} Dec 01 21:56:12 crc kubenswrapper[4962]: I1201 21:56:12.710423 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:12 crc kubenswrapper[4962]: I1201 21:56:12.942137 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-42wgd" podUID="5f1e9610-2753-4001-a9fb-5e020774725b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.199:5353: i/o timeout" Dec 01 21:56:13 crc kubenswrapper[4962]: I1201 21:56:13.322094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerStarted","Data":"5afe056cdbf7eb9c4ee729101a3f13b72572b19443d8c503f40610110b805f8b"} Dec 01 21:56:14 crc kubenswrapper[4962]: I1201 21:56:14.352054 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerStarted","Data":"19e4e8aa5d7790937197a98667f4c39ae0c70400f1c571e288d265fbf070736c"} Dec 01 21:56:14 crc kubenswrapper[4962]: I1201 21:56:14.352668 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerStarted","Data":"1a73bd9240037ee68094ed3e267ab37cd096c0744898f10ce5ddcc92cdf5849f"} Dec 01 21:56:15 crc kubenswrapper[4962]: I1201 21:56:15.390586 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 21:56:15 crc kubenswrapper[4962]: I1201 21:56:15.447154 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-56f444f67c-lv25m"] Dec 01 21:56:15 crc kubenswrapper[4962]: I1201 21:56:15.447379 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-56f444f67c-lv25m" podUID="13829155-474a-445c-b27f-bffdd6b0befb" containerName="heat-engine" containerID="cri-o://c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" gracePeriod=60 Dec 01 21:56:17 crc kubenswrapper[4962]: E1201 21:56:17.471587 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 21:56:17 crc kubenswrapper[4962]: E1201 21:56:17.497141 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 21:56:17 crc kubenswrapper[4962]: E1201 21:56:17.504387 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 21:56:17 crc kubenswrapper[4962]: E1201 21:56:17.504448 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-56f444f67c-lv25m" podUID="13829155-474a-445c-b27f-bffdd6b0befb" containerName="heat-engine" Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.489287 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerStarted","Data":"5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f"} Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.489829 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-central-agent" containerID="cri-o://5afe056cdbf7eb9c4ee729101a3f13b72572b19443d8c503f40610110b805f8b" gracePeriod=30 Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.490075 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.492551 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="proxy-httpd" containerID="cri-o://5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f" gracePeriod=30 Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.492622 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="sg-core" containerID="cri-o://19e4e8aa5d7790937197a98667f4c39ae0c70400f1c571e288d265fbf070736c" gracePeriod=30 Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.492666 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-notification-agent" containerID="cri-o://1a73bd9240037ee68094ed3e267ab37cd096c0744898f10ce5ddcc92cdf5849f" gracePeriod=30 Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.504638 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4997b" event={"ID":"93391e58-9905-4fec-a4fe-4ed30bbb5eec","Type":"ContainerStarted","Data":"0c593c3dbdade458e130af55b4171bbf267694e32969ebf966f687723d011ab0"} Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.533792 
4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.052141714 podStartE2EDuration="13.533774855s" podCreationTimestamp="2025-12-01 21:56:10 +0000 UTC" firstStartedPulling="2025-12-01 21:56:11.410913036 +0000 UTC m=+1355.512352251" lastFinishedPulling="2025-12-01 21:56:22.892546197 +0000 UTC m=+1366.993985392" observedRunningTime="2025-12-01 21:56:23.514097515 +0000 UTC m=+1367.615536710" watchObservedRunningTime="2025-12-01 21:56:23.533774855 +0000 UTC m=+1367.635214040" Dec 01 21:56:23 crc kubenswrapper[4962]: I1201 21:56:23.545006 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4997b" podStartSLOduration=2.539809446 podStartE2EDuration="15.544988565s" podCreationTimestamp="2025-12-01 21:56:08 +0000 UTC" firstStartedPulling="2025-12-01 21:56:09.881111969 +0000 UTC m=+1353.982551164" lastFinishedPulling="2025-12-01 21:56:22.886291088 +0000 UTC m=+1366.987730283" observedRunningTime="2025-12-01 21:56:23.532352845 +0000 UTC m=+1367.633792050" watchObservedRunningTime="2025-12-01 21:56:23.544988565 +0000 UTC m=+1367.646427760" Dec 01 21:56:24 crc kubenswrapper[4962]: I1201 21:56:24.517155 4962 generic.go:334] "Generic (PLEG): container finished" podID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerID="19e4e8aa5d7790937197a98667f4c39ae0c70400f1c571e288d265fbf070736c" exitCode=2 Dec 01 21:56:24 crc kubenswrapper[4962]: I1201 21:56:24.517387 4962 generic.go:334] "Generic (PLEG): container finished" podID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerID="1a73bd9240037ee68094ed3e267ab37cd096c0744898f10ce5ddcc92cdf5849f" exitCode=0 Dec 01 21:56:24 crc kubenswrapper[4962]: I1201 21:56:24.517396 4962 generic.go:334] "Generic (PLEG): container finished" podID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerID="5afe056cdbf7eb9c4ee729101a3f13b72572b19443d8c503f40610110b805f8b" exitCode=0 Dec 01 21:56:24 crc kubenswrapper[4962]: I1201 21:56:24.518433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerDied","Data":"19e4e8aa5d7790937197a98667f4c39ae0c70400f1c571e288d265fbf070736c"} Dec 01 21:56:24 crc kubenswrapper[4962]: I1201 21:56:24.518467 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerDied","Data":"1a73bd9240037ee68094ed3e267ab37cd096c0744898f10ce5ddcc92cdf5849f"} Dec 01 21:56:24 crc kubenswrapper[4962]: I1201 21:56:24.518478 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerDied","Data":"5afe056cdbf7eb9c4ee729101a3f13b72572b19443d8c503f40610110b805f8b"} Dec 01 21:56:27 crc kubenswrapper[4962]: E1201 21:56:27.463300 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 21:56:27 crc kubenswrapper[4962]: E1201 21:56:27.465424 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 21:56:27 crc kubenswrapper[4962]: E1201 21:56:27.468087 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 21:56:27 crc kubenswrapper[4962]: E1201 21:56:27.468130 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-56f444f67c-lv25m" podUID="13829155-474a-445c-b27f-bffdd6b0befb" containerName="heat-engine" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.417249 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.543075 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data-custom\") pod \"13829155-474a-445c-b27f-bffdd6b0befb\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.543343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data\") pod \"13829155-474a-445c-b27f-bffdd6b0befb\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.543397 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6pzp\" (UniqueName: \"kubernetes.io/projected/13829155-474a-445c-b27f-bffdd6b0befb-kube-api-access-d6pzp\") pod \"13829155-474a-445c-b27f-bffdd6b0befb\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.543477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-combined-ca-bundle\") pod \"13829155-474a-445c-b27f-bffdd6b0befb\" (UID: \"13829155-474a-445c-b27f-bffdd6b0befb\") " Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.549385 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13829155-474a-445c-b27f-bffdd6b0befb-kube-api-access-d6pzp" (OuterVolumeSpecName: "kube-api-access-d6pzp") pod "13829155-474a-445c-b27f-bffdd6b0befb" (UID: "13829155-474a-445c-b27f-bffdd6b0befb"). InnerVolumeSpecName "kube-api-access-d6pzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.557236 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13829155-474a-445c-b27f-bffdd6b0befb" (UID: "13829155-474a-445c-b27f-bffdd6b0befb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.575505 4962 generic.go:334] "Generic (PLEG): container finished" podID="13829155-474a-445c-b27f-bffdd6b0befb" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" exitCode=0 Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.576000 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-56f444f67c-lv25m" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.576025 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-56f444f67c-lv25m" event={"ID":"13829155-474a-445c-b27f-bffdd6b0befb","Type":"ContainerDied","Data":"c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166"} Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.576569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-56f444f67c-lv25m" event={"ID":"13829155-474a-445c-b27f-bffdd6b0befb","Type":"ContainerDied","Data":"b3725c76f9647cefa4f3236345172e1005cfd2fb7918b922a6c3f6bfd5ffc701"} Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.576608 4962 scope.go:117] "RemoveContainer" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.577755 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13829155-474a-445c-b27f-bffdd6b0befb" (UID: "13829155-474a-445c-b27f-bffdd6b0befb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.627719 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data" (OuterVolumeSpecName: "config-data") pod "13829155-474a-445c-b27f-bffdd6b0befb" (UID: "13829155-474a-445c-b27f-bffdd6b0befb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.645971 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.646004 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6pzp\" (UniqueName: \"kubernetes.io/projected/13829155-474a-445c-b27f-bffdd6b0befb-kube-api-access-d6pzp\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.646016 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.646026 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13829155-474a-445c-b27f-bffdd6b0befb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.667535 4962 scope.go:117] "RemoveContainer" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" Dec 01 21:56:29 crc kubenswrapper[4962]: E1201 21:56:29.668016 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166\": container with ID starting with c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166 not found: ID does not exist" containerID="c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.668167 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166"} err="failed to get container status \"c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166\": rpc error: code = NotFound desc = could not find container \"c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166\": container with ID starting with c25aa5102c17553db1df7a44b59452938278922e3f21be8bf35e08968659e166 not found: ID does not exist" Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.919169 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-56f444f67c-lv25m"] Dec 01 21:56:29 crc kubenswrapper[4962]: I1201 21:56:29.936675 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-56f444f67c-lv25m"] Dec 01 21:56:30 crc kubenswrapper[4962]: I1201 21:56:30.242979 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13829155-474a-445c-b27f-bffdd6b0befb" path="/var/lib/kubelet/pods/13829155-474a-445c-b27f-bffdd6b0befb/volumes" Dec 01 21:56:33 crc kubenswrapper[4962]: I1201 21:56:33.628043 4962 generic.go:334] "Generic (PLEG): container finished" podID="93391e58-9905-4fec-a4fe-4ed30bbb5eec" containerID="0c593c3dbdade458e130af55b4171bbf267694e32969ebf966f687723d011ab0" exitCode=0 Dec 01 21:56:33 crc kubenswrapper[4962]: I1201 21:56:33.628132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4997b" event={"ID":"93391e58-9905-4fec-a4fe-4ed30bbb5eec","Type":"ContainerDied","Data":"0c593c3dbdade458e130af55b4171bbf267694e32969ebf966f687723d011ab0"} Dec 01 21:56:35 crc 
kubenswrapper[4962]: I1201 21:56:35.099413 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.186915 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-scripts\") pod \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.186998 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-combined-ca-bundle\") pod \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.187064 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd4nr\" (UniqueName: \"kubernetes.io/projected/93391e58-9905-4fec-a4fe-4ed30bbb5eec-kube-api-access-jd4nr\") pod \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.187303 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-config-data\") pod \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\" (UID: \"93391e58-9905-4fec-a4fe-4ed30bbb5eec\") " Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.193000 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-scripts" (OuterVolumeSpecName: "scripts") pod "93391e58-9905-4fec-a4fe-4ed30bbb5eec" (UID: "93391e58-9905-4fec-a4fe-4ed30bbb5eec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.193160 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93391e58-9905-4fec-a4fe-4ed30bbb5eec-kube-api-access-jd4nr" (OuterVolumeSpecName: "kube-api-access-jd4nr") pod "93391e58-9905-4fec-a4fe-4ed30bbb5eec" (UID: "93391e58-9905-4fec-a4fe-4ed30bbb5eec"). InnerVolumeSpecName "kube-api-access-jd4nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.225056 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-config-data" (OuterVolumeSpecName: "config-data") pod "93391e58-9905-4fec-a4fe-4ed30bbb5eec" (UID: "93391e58-9905-4fec-a4fe-4ed30bbb5eec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.230970 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93391e58-9905-4fec-a4fe-4ed30bbb5eec" (UID: "93391e58-9905-4fec-a4fe-4ed30bbb5eec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.290123 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.290158 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.290167 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93391e58-9905-4fec-a4fe-4ed30bbb5eec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.290178 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd4nr\" (UniqueName: \"kubernetes.io/projected/93391e58-9905-4fec-a4fe-4ed30bbb5eec-kube-api-access-jd4nr\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.654810 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4997b" event={"ID":"93391e58-9905-4fec-a4fe-4ed30bbb5eec","Type":"ContainerDied","Data":"b181454367b6a79571acb2a769b6881754138bee8f6d5d109eea996485a78b02"} Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.654845 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b181454367b6a79571acb2a769b6881754138bee8f6d5d109eea996485a78b02" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.654899 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4997b" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.840012 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:56:35 crc kubenswrapper[4962]: E1201 21:56:35.843956 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13829155-474a-445c-b27f-bffdd6b0befb" containerName="heat-engine" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.843996 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="13829155-474a-445c-b27f-bffdd6b0befb" containerName="heat-engine" Dec 01 21:56:35 crc kubenswrapper[4962]: E1201 21:56:35.844031 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93391e58-9905-4fec-a4fe-4ed30bbb5eec" containerName="nova-cell0-conductor-db-sync" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.844038 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="93391e58-9905-4fec-a4fe-4ed30bbb5eec" containerName="nova-cell0-conductor-db-sync" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.844403 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="93391e58-9905-4fec-a4fe-4ed30bbb5eec" containerName="nova-cell0-conductor-db-sync" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.844442 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="13829155-474a-445c-b27f-bffdd6b0befb" containerName="heat-engine" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.845502 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.886514 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vtxzn" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.886752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 21:56:35 crc kubenswrapper[4962]: I1201 21:56:35.906465 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.012480 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.012541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmfcd\" (UniqueName: \"kubernetes.io/projected/e95f5aa0-0150-49da-a25d-c2eb369d394e-kube-api-access-zmfcd\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.012860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.115203 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.115849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmfcd\" (UniqueName: \"kubernetes.io/projected/e95f5aa0-0150-49da-a25d-c2eb369d394e-kube-api-access-zmfcd\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.116027 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.119927 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.120893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.145688 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmfcd\" (UniqueName: \"kubernetes.io/projected/e95f5aa0-0150-49da-a25d-c2eb369d394e-kube-api-access-zmfcd\") pod \"nova-cell0-conductor-0\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.228149 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vtxzn" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.235136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.751517 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.752009 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-log" containerID="cri-o://836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303" gracePeriod=30 Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.752464 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-httpd" containerID="cri-o://13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6" gracePeriod=30 Dec 01 21:56:36 crc kubenswrapper[4962]: I1201 21:56:36.773437 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:56:37 crc kubenswrapper[4962]: I1201 21:56:37.676040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e95f5aa0-0150-49da-a25d-c2eb369d394e","Type":"ContainerStarted","Data":"0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46"} Dec 01 21:56:37 crc kubenswrapper[4962]: I1201 21:56:37.676500 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e95f5aa0-0150-49da-a25d-c2eb369d394e","Type":"ContainerStarted","Data":"ac87588bd2ae85ad91deb089f7e7dff000d4dc47aef8f4f7afa635752c16a097"} Dec 01 21:56:37 crc kubenswrapper[4962]: I1201 21:56:37.676517 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 21:56:37 crc kubenswrapper[4962]: I1201 21:56:37.678242 4962 generic.go:334] "Generic (PLEG): container finished" podID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerID="836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303" exitCode=143 Dec 01 21:56:37 crc kubenswrapper[4962]: I1201 21:56:37.678282 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ded81d0-3be6-4293-abd6-c3434d42667e","Type":"ContainerDied","Data":"836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303"} Dec 01 21:56:37 crc kubenswrapper[4962]: I1201 21:56:37.698598 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.6985804829999998 podStartE2EDuration="2.698580483s" podCreationTimestamp="2025-12-01 21:56:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:56:37.692004136 +0000 UTC m=+1381.793443371" watchObservedRunningTime="2025-12-01 21:56:37.698580483 +0000 UTC m=+1381.800019678" Dec 01 21:56:38 crc kubenswrapper[4962]: I1201 21:56:38.475793 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:56:38 crc kubenswrapper[4962]: I1201 21:56:38.539422 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:56:38 crc kubenswrapper[4962]: I1201 21:56:38.539747 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-log" containerID="cri-o://844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca" gracePeriod=30 Dec 01 21:56:38 crc kubenswrapper[4962]: I1201 21:56:38.540007 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-httpd" containerID="cri-o://e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972" gracePeriod=30 Dec 01 21:56:38 crc kubenswrapper[4962]: I1201 21:56:38.688011 4962 generic.go:334] "Generic (PLEG): container finished" podID="788bee1d-914f-4efc-acea-67ff250ce73f" containerID="844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca" exitCode=143 Dec 01 21:56:38 crc kubenswrapper[4962]: I1201 21:56:38.688104 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"788bee1d-914f-4efc-acea-67ff250ce73f","Type":"ContainerDied","Data":"844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca"} Dec 01 21:56:39 crc kubenswrapper[4962]: I1201 21:56:39.699820 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" gracePeriod=30 Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.515705 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645464 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-combined-ca-bundle\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-config-data\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645605 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-logs\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645724 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-httpd-run\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645774 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65226\" (UniqueName: \"kubernetes.io/projected/1ded81d0-3be6-4293-abd6-c3434d42667e-kube-api-access-65226\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645839 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-scripts\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.645916 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-public-tls-certs\") pod \"1ded81d0-3be6-4293-abd6-c3434d42667e\" (UID: \"1ded81d0-3be6-4293-abd6-c3434d42667e\") " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.647912 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.649823 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-logs" (OuterVolumeSpecName: "logs") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.653189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ded81d0-3be6-4293-abd6-c3434d42667e-kube-api-access-65226" (OuterVolumeSpecName: "kube-api-access-65226") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "kube-api-access-65226". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.653196 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-scripts" (OuterVolumeSpecName: "scripts") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.663108 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.682452 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.712987 4962 generic.go:334] "Generic (PLEG): container finished" podID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerID="13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6" exitCode=0 Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.713032 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ded81d0-3be6-4293-abd6-c3434d42667e","Type":"ContainerDied","Data":"13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6"} Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.713062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ded81d0-3be6-4293-abd6-c3434d42667e","Type":"ContainerDied","Data":"8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe"} Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.713082 4962 scope.go:117] "RemoveContainer" containerID="13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.713246 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.732153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.737245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-config-data" (OuterVolumeSpecName: "config-data") pod "1ded81d0-3be6-4293-abd6-c3434d42667e" (UID: "1ded81d0-3be6-4293-abd6-c3434d42667e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748803 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748836 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65226\" (UniqueName: \"kubernetes.io/projected/1ded81d0-3be6-4293-abd6-c3434d42667e-kube-api-access-65226\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748847 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748856 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748865 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748873 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ded81d0-3be6-4293-abd6-c3434d42667e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748882 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ded81d0-3be6-4293-abd6-c3434d42667e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.748910 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.773822 4962 scope.go:117] "RemoveContainer" containerID="836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.778745 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.799168 4962 scope.go:117] "RemoveContainer" 
containerID="13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6" Dec 01 21:56:40 crc kubenswrapper[4962]: E1201 21:56:40.799643 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6\": container with ID starting with 13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6 not found: ID does not exist" containerID="13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.799689 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6"} err="failed to get container status \"13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6\": rpc error: code = NotFound desc = could not find container \"13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6\": container with ID starting with 13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6 not found: ID does not exist" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.799718 4962 scope.go:117] "RemoveContainer" containerID="836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303" Dec 01 21:56:40 crc kubenswrapper[4962]: E1201 21:56:40.800178 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303\": container with ID starting with 836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303 not found: ID does not exist" containerID="836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.800218 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303"} err="failed to get container status \"836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303\": rpc error: code = NotFound desc = could not find container \"836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303\": container with ID starting with 836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303 not found: ID does not exist" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.850858 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:40 crc kubenswrapper[4962]: I1201 21:56:40.877637 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.077754 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.091125 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.124525 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:56:41 crc kubenswrapper[4962]: E1201 21:56:41.125214 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-log" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.125246 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-log" Dec 01 21:56:41 crc kubenswrapper[4962]: E1201 21:56:41.125269 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-httpd" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.125277 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-httpd" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.125604 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-log" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.125637 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" containerName="glance-httpd" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.127224 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.130119 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.131433 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.141536 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:56:41 crc kubenswrapper[4962]: E1201 21:56:41.238514 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:41 crc kubenswrapper[4962]: E1201 21:56:41.240025 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:41 crc kubenswrapper[4962]: E1201 21:56:41.241179 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:41 crc kubenswrapper[4962]: E1201 21:56:41.241214 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.262676 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/87780c0b-00e7-44cd-93da-c22f2b2a771c-logs\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.262736 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-config-data\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.262753 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6b7h\" (UniqueName: \"kubernetes.io/projected/87780c0b-00e7-44cd-93da-c22f2b2a771c-kube-api-access-d6b7h\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.262770 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87780c0b-00e7-44cd-93da-c22f2b2a771c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.262814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-scripts\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.262849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.262926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.263007 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.364777 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.365057 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.365785 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87780c0b-00e7-44cd-93da-c22f2b2a771c-logs\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.365981 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-config-data\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.366041 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6b7h\" (UniqueName: \"kubernetes.io/projected/87780c0b-00e7-44cd-93da-c22f2b2a771c-kube-api-access-d6b7h\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.366089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87780c0b-00e7-44cd-93da-c22f2b2a771c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.366307 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-scripts\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.366432 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.366605 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87780c0b-00e7-44cd-93da-c22f2b2a771c-logs\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.366674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.367979 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/87780c0b-00e7-44cd-93da-c22f2b2a771c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.372926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-scripts\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.373080 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-config-data\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.373526 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.373806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87780c0b-00e7-44cd-93da-c22f2b2a771c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.391748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6b7h\" (UniqueName: \"kubernetes.io/projected/87780c0b-00e7-44cd-93da-c22f2b2a771c-kube-api-access-d6b7h\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.398798 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"87780c0b-00e7-44cd-93da-c22f2b2a771c\") " pod="openstack/glance-default-external-api-0" Dec 01 21:56:41 crc kubenswrapper[4962]: I1201 21:56:41.460205 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.286665 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ded81d0-3be6-4293-abd6-c3434d42667e" path="/var/lib/kubelet/pods/1ded81d0-3be6-4293-abd6-c3434d42667e/volumes" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.287910 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.571640 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-config-data\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-logs\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620475 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-internal-tls-certs\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620561 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfknw\" (UniqueName: \"kubernetes.io/projected/788bee1d-914f-4efc-acea-67ff250ce73f-kube-api-access-mfknw\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620616 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-httpd-run\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.620646 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-scripts\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.621853 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.622230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-logs" (OuterVolumeSpecName: "logs") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.631497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-scripts" (OuterVolumeSpecName: "scripts") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.639073 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.659201 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788bee1d-914f-4efc-acea-67ff250ce73f-kube-api-access-mfknw" (OuterVolumeSpecName: "kube-api-access-mfknw") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "kube-api-access-mfknw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.730316 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.736289 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.736329 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.736358 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.736375 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfknw\" (UniqueName: \"kubernetes.io/projected/788bee1d-914f-4efc-acea-67ff250ce73f-kube-api-access-mfknw\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.736388 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/788bee1d-914f-4efc-acea-67ff250ce73f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.736398 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.751082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"87780c0b-00e7-44cd-93da-c22f2b2a771c","Type":"ContainerStarted","Data":"4c36b167c0f64afc353fc11a2416dff6172567944d7c389e91f9b02ed10ee779"} Dec 01 21:56:42 crc kubenswrapper[4962]: E1201 21:56:42.761387 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle podName:788bee1d-914f-4efc-acea-67ff250ce73f nodeName:}" failed. No retries permitted until 2025-12-01 21:56:43.26135632 +0000 UTC m=+1387.362795515 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f") : error deleting /var/lib/kubelet/pods/788bee1d-914f-4efc-acea-67ff250ce73f/volume-subpaths: remove /var/lib/kubelet/pods/788bee1d-914f-4efc-acea-67ff250ce73f/volume-subpaths: no such file or directory Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.766097 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-config-data" (OuterVolumeSpecName: "config-data") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.766138 4962 generic.go:334] "Generic (PLEG): container finished" podID="788bee1d-914f-4efc-acea-67ff250ce73f" containerID="e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972" exitCode=0 Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.766176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"788bee1d-914f-4efc-acea-67ff250ce73f","Type":"ContainerDied","Data":"e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972"} Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.766203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"788bee1d-914f-4efc-acea-67ff250ce73f","Type":"ContainerDied","Data":"cad83d54a0a2223e5c7a4492a3c5e86b9061a353d47a7a476f5b7a5570c7b218"} Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.766218 4962 scope.go:117] "RemoveContainer" containerID="e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.766257 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.766801 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.839631 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.839666 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.850007 4962 scope.go:117] "RemoveContainer" containerID="844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.886283 4962 scope.go:117] "RemoveContainer" containerID="e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972" Dec 01 21:56:42 crc kubenswrapper[4962]: E1201 21:56:42.886713 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972\": container with ID starting with e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972 not found: ID does not exist" containerID="e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.886748 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972"} err="failed to get container status \"e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972\": rpc error: code = NotFound desc = could not find container \"e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972\": container with ID starting with e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972 not found: ID does not exist" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.886778 4962 scope.go:117] "RemoveContainer" 
containerID="844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca" Dec 01 21:56:42 crc kubenswrapper[4962]: E1201 21:56:42.888687 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca\": container with ID starting with 844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca not found: ID does not exist" containerID="844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca" Dec 01 21:56:42 crc kubenswrapper[4962]: I1201 21:56:42.888747 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca"} err="failed to get container status \"844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca\": rpc error: code = NotFound desc = could not find container \"844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca\": container with ID starting with 844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca not found: ID does not exist" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.347970 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle\") pod \"788bee1d-914f-4efc-acea-67ff250ce73f\" (UID: \"788bee1d-914f-4efc-acea-67ff250ce73f\") " Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.352104 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "788bee1d-914f-4efc-acea-67ff250ce73f" (UID: "788bee1d-914f-4efc-acea-67ff250ce73f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.424098 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.455608 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788bee1d-914f-4efc-acea-67ff250ce73f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.459265 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.473578 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:56:43 crc kubenswrapper[4962]: E1201 21:56:43.474175 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-log" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.474194 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-log" Dec 01 21:56:43 crc kubenswrapper[4962]: E1201 21:56:43.474238 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-httpd" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.474247 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-httpd" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.474477 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-httpd" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.474495 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" containerName="glance-log" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.475715 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.478068 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.481803 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.497161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659192 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/891b6978-5cc9-464e-ae37-f9f7b3dadc62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659411 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f97k\" (UniqueName: \"kubernetes.io/projected/891b6978-5cc9-464e-ae37-f9f7b3dadc62-kube-api-access-2f97k\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659627 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659742 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659840 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.659920 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/891b6978-5cc9-464e-ae37-f9f7b3dadc62-logs\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.761993 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.762056 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.762092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/891b6978-5cc9-464e-ae37-f9f7b3dadc62-logs\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.762190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/891b6978-5cc9-464e-ae37-f9f7b3dadc62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.762472 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.762786 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/891b6978-5cc9-464e-ae37-f9f7b3dadc62-logs\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.762837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/891b6978-5cc9-464e-ae37-f9f7b3dadc62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.762847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.763055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f97k\" (UniqueName: 
\"kubernetes.io/projected/891b6978-5cc9-464e-ae37-f9f7b3dadc62-kube-api-access-2f97k\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.763105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.763619 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.766808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.767063 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.767086 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.767247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891b6978-5cc9-464e-ae37-f9f7b3dadc62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.780687 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"87780c0b-00e7-44cd-93da-c22f2b2a771c","Type":"ContainerStarted","Data":"8c3384623f713a3aa35d1c8e26b574cd5237c49d7b288e6d6cc43d4e46c38f2b"} Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.780732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"87780c0b-00e7-44cd-93da-c22f2b2a771c","Type":"ContainerStarted","Data":"0de6ffe4900f235695304bba96d2ee1d7072fa3bce665641ee71f21a7e37e2c6"} Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.803552 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f97k\" (UniqueName: \"kubernetes.io/projected/891b6978-5cc9-464e-ae37-f9f7b3dadc62-kube-api-access-2f97k\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 
21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.812272 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"891b6978-5cc9-464e-ae37-f9f7b3dadc62\") " pod="openstack/glance-default-internal-api-0" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.815305 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.815284759 podStartE2EDuration="2.815284759s" podCreationTimestamp="2025-12-01 21:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:56:43.799800527 +0000 UTC m=+1387.901239722" watchObservedRunningTime="2025-12-01 21:56:43.815284759 +0000 UTC m=+1387.916723954" Dec 01 21:56:43 crc kubenswrapper[4962]: I1201 21:56:43.819335 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:44 crc kubenswrapper[4962]: I1201 21:56:44.233341 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="788bee1d-914f-4efc-acea-67ff250ce73f" path="/var/lib/kubelet/pods/788bee1d-914f-4efc-acea-67ff250ce73f/volumes" Dec 01 21:56:44 crc kubenswrapper[4962]: I1201 21:56:44.418061 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 21:56:44 crc kubenswrapper[4962]: I1201 21:56:44.795659 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"891b6978-5cc9-464e-ae37-f9f7b3dadc62","Type":"ContainerStarted","Data":"61515de510735367908d3d364491ad160f1898f4c871fe478dc76079b4f0d0df"} Dec 01 21:56:45 crc kubenswrapper[4962]: I1201 21:56:45.808095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"891b6978-5cc9-464e-ae37-f9f7b3dadc62","Type":"ContainerStarted","Data":"3ea8fe4a2831692f7ba36a7afdd8c966b57e47b5420af73096e77a80e1814992"} Dec 01 21:56:45 crc kubenswrapper[4962]: I1201 21:56:45.808374 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"891b6978-5cc9-464e-ae37-f9f7b3dadc62","Type":"ContainerStarted","Data":"c6be20b3b5cf200dab8fa5ae143de4a961f6a17265a6efd492ac0bcca83cda3d"} Dec 01 21:56:45 crc kubenswrapper[4962]: I1201 21:56:45.840174 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.8401582899999998 podStartE2EDuration="2.84015829s" podCreationTimestamp="2025-12-01 21:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:56:45.829311221 +0000 UTC m=+1389.930750426" watchObservedRunningTime="2025-12-01 21:56:45.84015829 +0000 UTC m=+1389.941597485" Dec 01 21:56:46 crc kubenswrapper[4962]: E1201 21:56:46.238121 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:46 crc kubenswrapper[4962]: E1201 21:56:46.239945 4962 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:46 crc kubenswrapper[4962]: E1201 21:56:46.242270 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:46 crc kubenswrapper[4962]: E1201 21:56:46.242308 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.396177 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-h9qhc"] Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.398322 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.408122 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-h9qhc"] Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.498860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjl9p\" (UniqueName: \"kubernetes.io/projected/805240bc-5355-4ffa-886c-a4e96fb3a540-kube-api-access-mjl9p\") pod \"aodh-db-create-h9qhc\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.499225 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805240bc-5355-4ffa-886c-a4e96fb3a540-operator-scripts\") pod \"aodh-db-create-h9qhc\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.505180 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-e930-account-create-update-gr4rp"] Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.511292 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.513706 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.525713 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-e930-account-create-update-gr4rp"] Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.601464 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737cedfd-d3c6-4f5b-8289-af4b32ec094a-operator-scripts\") pod \"aodh-e930-account-create-update-gr4rp\" (UID: \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.601559 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805240bc-5355-4ffa-886c-a4e96fb3a540-operator-scripts\") pod \"aodh-db-create-h9qhc\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.601650 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhspx\" (UniqueName: \"kubernetes.io/projected/737cedfd-d3c6-4f5b-8289-af4b32ec094a-kube-api-access-dhspx\") pod \"aodh-e930-account-create-update-gr4rp\" (UID: \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.601744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjl9p\" (UniqueName: \"kubernetes.io/projected/805240bc-5355-4ffa-886c-a4e96fb3a540-kube-api-access-mjl9p\") pod \"aodh-db-create-h9qhc\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.602909 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805240bc-5355-4ffa-886c-a4e96fb3a540-operator-scripts\") pod \"aodh-db-create-h9qhc\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.636913 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjl9p\" (UniqueName: \"kubernetes.io/projected/805240bc-5355-4ffa-886c-a4e96fb3a540-kube-api-access-mjl9p\") pod \"aodh-db-create-h9qhc\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.703602 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737cedfd-d3c6-4f5b-8289-af4b32ec094a-operator-scripts\") pod \"aodh-e930-account-create-update-gr4rp\" (UID: \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.703721 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhspx\" (UniqueName: \"kubernetes.io/projected/737cedfd-d3c6-4f5b-8289-af4b32ec094a-kube-api-access-dhspx\") pod \"aodh-e930-account-create-update-gr4rp\" (UID: 
\"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.704684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737cedfd-d3c6-4f5b-8289-af4b32ec094a-operator-scripts\") pod \"aodh-e930-account-create-update-gr4rp\" (UID: \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.715547 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.723606 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhspx\" (UniqueName: \"kubernetes.io/projected/737cedfd-d3c6-4f5b-8289-af4b32ec094a-kube-api-access-dhspx\") pod \"aodh-e930-account-create-update-gr4rp\" (UID: \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:49 crc kubenswrapper[4962]: I1201 21:56:49.835535 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.184152 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-h9qhc"] Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.327180 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-e930-account-create-update-gr4rp"] Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.869189 4962 generic.go:334] "Generic (PLEG): container finished" podID="737cedfd-d3c6-4f5b-8289-af4b32ec094a" containerID="e34f0dac6167d0be003d127806068534bce4d39d4a26e764dcf94cfeaa2bd185" exitCode=0 Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.869245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e930-account-create-update-gr4rp" event={"ID":"737cedfd-d3c6-4f5b-8289-af4b32ec094a","Type":"ContainerDied","Data":"e34f0dac6167d0be003d127806068534bce4d39d4a26e764dcf94cfeaa2bd185"} Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.869328 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e930-account-create-update-gr4rp" event={"ID":"737cedfd-d3c6-4f5b-8289-af4b32ec094a","Type":"ContainerStarted","Data":"5f9a062675c354ce3e96f12ec7adb3b04100830c11c1cedfc93eec77937f7d22"} Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.871135 4962 generic.go:334] "Generic (PLEG): container finished" podID="805240bc-5355-4ffa-886c-a4e96fb3a540" containerID="c77c0cb97248d978dc7a7e01e23b41b641701625149ceea908f513bdfe61d35f" exitCode=0 Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.871183 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-h9qhc" event={"ID":"805240bc-5355-4ffa-886c-a4e96fb3a540","Type":"ContainerDied","Data":"c77c0cb97248d978dc7a7e01e23b41b641701625149ceea908f513bdfe61d35f"} Dec 01 21:56:50 crc kubenswrapper[4962]: I1201 21:56:50.871209 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-h9qhc" event={"ID":"805240bc-5355-4ffa-886c-a4e96fb3a540","Type":"ContainerStarted","Data":"14eaaf456409c0aa524e67a7c067681bf00231b708600f74564db3138505f566"} Dec 01 21:56:51 crc kubenswrapper[4962]: E1201 21:56:51.238678 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
Dec 01 21:56:51 crc kubenswrapper[4962]: E1201 21:56:51.238678 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 01 21:56:51 crc kubenswrapper[4962]: E1201 21:56:51.241350 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 01 21:56:51 crc kubenswrapper[4962]: E1201 21:56:51.242920 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 01 21:56:51 crc kubenswrapper[4962]: E1201 21:56:51.243028 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor"
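The readiness check failing here is an exec probe that runs pgrep inside the nova-conductor container; the `exit code -1` is CRI-O refusing to register a new exec session because the container is already stopping, not pgrep itself reporting failure. Run outside the container, the same check looks roughly like this (a sketch; it assumes a procps-ng pgrep on the host and a locally visible nova-conductor process):

```go
package main

import (
	"errors"
	"fmt"
	"os/exec"
)

func main() {
	// The probe command from the log: match processes named nova-conductor
	// whose run state is one of D, R, S or T (pgrep's --runstates filter).
	cmd := exec.Command("/usr/bin/pgrep", "-r", "DRST", "nova-conductor")
	out, err := cmd.Output()
	var exitErr *exec.ExitError
	switch {
	case err == nil:
		fmt.Printf("probe passes, matched PIDs:\n%s", out)
	case errors.As(err, &exitErr):
		// pgrep exits 1 when nothing matches; an exec probe treats any
		// non-zero exit as "not ready".
		fmt.Println("probe fails, pgrep exit code:", exitErr.ExitCode())
	default:
		// Comparable to the records above: the command could not be run
		// at all, so there is no meaningful exit code (kubelet logs -1).
		fmt.Println("probe errored:", err)
	}
}
```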
Need to start a new one" pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.575684 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhspx\" (UniqueName: \"kubernetes.io/projected/737cedfd-d3c6-4f5b-8289-af4b32ec094a-kube-api-access-dhspx\") pod \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\" (UID: \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.575757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737cedfd-d3c6-4f5b-8289-af4b32ec094a-operator-scripts\") pod \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\" (UID: \"737cedfd-d3c6-4f5b-8289-af4b32ec094a\") " Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.577207 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737cedfd-d3c6-4f5b-8289-af4b32ec094a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "737cedfd-d3c6-4f5b-8289-af4b32ec094a" (UID: "737cedfd-d3c6-4f5b-8289-af4b32ec094a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.588339 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737cedfd-d3c6-4f5b-8289-af4b32ec094a-kube-api-access-dhspx" (OuterVolumeSpecName: "kube-api-access-dhspx") pod "737cedfd-d3c6-4f5b-8289-af4b32ec094a" (UID: "737cedfd-d3c6-4f5b-8289-af4b32ec094a"). InnerVolumeSpecName "kube-api-access-dhspx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.678339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjl9p\" (UniqueName: \"kubernetes.io/projected/805240bc-5355-4ffa-886c-a4e96fb3a540-kube-api-access-mjl9p\") pod \"805240bc-5355-4ffa-886c-a4e96fb3a540\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.678418 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805240bc-5355-4ffa-886c-a4e96fb3a540-operator-scripts\") pod \"805240bc-5355-4ffa-886c-a4e96fb3a540\" (UID: \"805240bc-5355-4ffa-886c-a4e96fb3a540\") " Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.678900 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhspx\" (UniqueName: \"kubernetes.io/projected/737cedfd-d3c6-4f5b-8289-af4b32ec094a-kube-api-access-dhspx\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.678924 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737cedfd-d3c6-4f5b-8289-af4b32ec094a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.679185 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805240bc-5355-4ffa-886c-a4e96fb3a540-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "805240bc-5355-4ffa-886c-a4e96fb3a540" (UID: "805240bc-5355-4ffa-886c-a4e96fb3a540"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.682156 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805240bc-5355-4ffa-886c-a4e96fb3a540-kube-api-access-mjl9p" (OuterVolumeSpecName: "kube-api-access-mjl9p") pod "805240bc-5355-4ffa-886c-a4e96fb3a540" (UID: "805240bc-5355-4ffa-886c-a4e96fb3a540"). InnerVolumeSpecName "kube-api-access-mjl9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.780593 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805240bc-5355-4ffa-886c-a4e96fb3a540-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.780631 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjl9p\" (UniqueName: \"kubernetes.io/projected/805240bc-5355-4ffa-886c-a4e96fb3a540-kube-api-access-mjl9p\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.904693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-h9qhc" event={"ID":"805240bc-5355-4ffa-886c-a4e96fb3a540","Type":"ContainerDied","Data":"14eaaf456409c0aa524e67a7c067681bf00231b708600f74564db3138505f566"} Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.904720 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-h9qhc" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.904733 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14eaaf456409c0aa524e67a7c067681bf00231b708600f74564db3138505f566" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.907756 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-e930-account-create-update-gr4rp" Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.908756 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e930-account-create-update-gr4rp" event={"ID":"737cedfd-d3c6-4f5b-8289-af4b32ec094a","Type":"ContainerDied","Data":"5f9a062675c354ce3e96f12ec7adb3b04100830c11c1cedfc93eec77937f7d22"} Dec 01 21:56:52 crc kubenswrapper[4962]: I1201 21:56:52.908776 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f9a062675c354ce3e96f12ec7adb3b04100830c11c1cedfc93eec77937f7d22" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.819705 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.820361 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.883886 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.913674 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.936207 4962 generic.go:334] "Generic (PLEG): container finished" podID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerID="5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f" exitCode=137 Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.936286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerDied","Data":"5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f"} Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.936333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"272645a0-8a27-4b3e-8f4f-5e456f118d84","Type":"ContainerDied","Data":"2552f2dca1b12dabd1917284e7c2a47006998b9f9527e8f0b6250123f888cc95"} Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.936346 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2552f2dca1b12dabd1917284e7c2a47006998b9f9527e8f0b6250123f888cc95" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.936702 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.936750 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 21:56:53 crc kubenswrapper[4962]: I1201 21:56:53.961105 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.110955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-sg-core-conf-yaml\") pod \"272645a0-8a27-4b3e-8f4f-5e456f118d84\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.111538 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-config-data\") pod \"272645a0-8a27-4b3e-8f4f-5e456f118d84\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.111654 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mvc8\" (UniqueName: \"kubernetes.io/projected/272645a0-8a27-4b3e-8f4f-5e456f118d84-kube-api-access-4mvc8\") pod \"272645a0-8a27-4b3e-8f4f-5e456f118d84\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.111961 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-combined-ca-bundle\") pod \"272645a0-8a27-4b3e-8f4f-5e456f118d84\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.112029 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-run-httpd\") pod \"272645a0-8a27-4b3e-8f4f-5e456f118d84\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.112510 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-log-httpd\") pod \"272645a0-8a27-4b3e-8f4f-5e456f118d84\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.112607 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-scripts\") pod \"272645a0-8a27-4b3e-8f4f-5e456f118d84\" (UID: \"272645a0-8a27-4b3e-8f4f-5e456f118d84\") " Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.112999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "272645a0-8a27-4b3e-8f4f-5e456f118d84" (UID: "272645a0-8a27-4b3e-8f4f-5e456f118d84"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.113014 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "272645a0-8a27-4b3e-8f4f-5e456f118d84" (UID: "272645a0-8a27-4b3e-8f4f-5e456f118d84"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.113832 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.113858 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/272645a0-8a27-4b3e-8f4f-5e456f118d84-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.117693 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-scripts" (OuterVolumeSpecName: "scripts") pod "272645a0-8a27-4b3e-8f4f-5e456f118d84" (UID: "272645a0-8a27-4b3e-8f4f-5e456f118d84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.118463 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272645a0-8a27-4b3e-8f4f-5e456f118d84-kube-api-access-4mvc8" (OuterVolumeSpecName: "kube-api-access-4mvc8") pod "272645a0-8a27-4b3e-8f4f-5e456f118d84" (UID: "272645a0-8a27-4b3e-8f4f-5e456f118d84"). InnerVolumeSpecName "kube-api-access-4mvc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.155986 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.156146 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.161189 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.177407 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "272645a0-8a27-4b3e-8f4f-5e456f118d84" (UID: "272645a0-8a27-4b3e-8f4f-5e456f118d84"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.215911 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.215964 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.215983 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mvc8\" (UniqueName: \"kubernetes.io/projected/272645a0-8a27-4b3e-8f4f-5e456f118d84-kube-api-access-4mvc8\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.271187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "272645a0-8a27-4b3e-8f4f-5e456f118d84" (UID: "272645a0-8a27-4b3e-8f4f-5e456f118d84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.311702 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-config-data" (OuterVolumeSpecName: "config-data") pod "272645a0-8a27-4b3e-8f4f-5e456f118d84" (UID: "272645a0-8a27-4b3e-8f4f-5e456f118d84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.319834 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.319895 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272645a0-8a27-4b3e-8f4f-5e456f118d84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.817319 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-9zgsj"] Dec 01 21:56:54 crc kubenswrapper[4962]: E1201 21:56:54.818386 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805240bc-5355-4ffa-886c-a4e96fb3a540" containerName="mariadb-database-create" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818412 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805240bc-5355-4ffa-886c-a4e96fb3a540" containerName="mariadb-database-create" Dec 01 21:56:54 crc kubenswrapper[4962]: E1201 21:56:54.818438 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="sg-core" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818447 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="sg-core" Dec 01 21:56:54 crc kubenswrapper[4962]: E1201 21:56:54.818461 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="proxy-httpd" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818469 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="proxy-httpd" Dec 01 21:56:54 crc kubenswrapper[4962]: E1201 21:56:54.818511 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-central-agent" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818517 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-central-agent" Dec 01 21:56:54 crc kubenswrapper[4962]: E1201 21:56:54.818533 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-notification-agent" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818540 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-notification-agent" Dec 01 21:56:54 crc kubenswrapper[4962]: E1201 21:56:54.818552 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737cedfd-d3c6-4f5b-8289-af4b32ec094a" containerName="mariadb-account-create-update" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818560 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="737cedfd-d3c6-4f5b-8289-af4b32ec094a" containerName="mariadb-account-create-update" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818829 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-central-agent" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818849 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="ceilometer-notification-agent" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818864 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="proxy-httpd" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818879 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="272645a0-8a27-4b3e-8f4f-5e456f118d84" containerName="sg-core" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818894 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="805240bc-5355-4ffa-886c-a4e96fb3a540" containerName="mariadb-database-create" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.818908 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="737cedfd-d3c6-4f5b-8289-af4b32ec094a" containerName="mariadb-account-create-update" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.819924 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.827821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.828100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.828285 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-bf8t4" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.828469 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.831956 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-9zgsj"] Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.933792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-config-data\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.933873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-scripts\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.934100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-combined-ca-bundle\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.934295 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4cv\" (UniqueName: \"kubernetes.io/projected/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-kube-api-access-nn4cv\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.947359 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:54 crc kubenswrapper[4962]: I1201 21:56:54.984729 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.004877 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.020241 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.022865 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.024622 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.027219 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.036685 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-scripts\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.036786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-combined-ca-bundle\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.036898 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4cv\" (UniqueName: \"kubernetes.io/projected/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-kube-api-access-nn4cv\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.037162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-config-data\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.039901 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.049317 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-combined-ca-bundle\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.049865 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-scripts\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.065895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-config-data\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.066202 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4cv\" (UniqueName: \"kubernetes.io/projected/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-kube-api-access-nn4cv\") pod \"aodh-db-sync-9zgsj\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") " pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.138999 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-config-data\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.139115 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.139161 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.139307 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhtj\" (UniqueName: \"kubernetes.io/projected/825ec49a-9597-4e47-8c1c-9fac21baed5a-kube-api-access-8lhtj\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.139355 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-scripts\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.139401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-run-httpd\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.139419 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-log-httpd\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.142554 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.242388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhtj\" (UniqueName: \"kubernetes.io/projected/825ec49a-9597-4e47-8c1c-9fac21baed5a-kube-api-access-8lhtj\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.242886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-scripts\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.243119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-run-httpd\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.243170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-log-httpd\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.243241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-config-data\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.243310 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.243350 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.243728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-run-httpd\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.244391 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-log-httpd\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.248135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-config-data\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0" 
Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.249532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0"
Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.250975 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-scripts\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0"
Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.251450 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0"
Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.263557 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhtj\" (UniqueName: \"kubernetes.io/projected/825ec49a-9597-4e47-8c1c-9fac21baed5a-kube-api-access-8lhtj\") pod \"ceilometer-0\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " pod="openstack/ceilometer-0"
Dec 01 21:56:55 crc kubenswrapper[4962]: I1201 21:56:55.272120 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
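For the rebuilt ceilometer-0, each volume walks the same ladder: VerifyControllerAttachedVolume, MountVolume started, MountVolume.SetUp succeeded, and only once all seven are up does sandbox creation proceed. Reconstructed from the names in these records, the pod's volume list would look roughly like this in the k8s.io/api types (a sketch that assumes the k8s.io/api module; the backing secret names are guessed from the reflector lines above, since only the volume names and plugin kinds are actually in the log):

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Volume set reconstructed from the MountVolume records for ceilometer-0.
	vols := []corev1.Volume{
		{Name: "config-data", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "ceilometer-config-data"}}}, // guessed
		{Name: "scripts", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "ceilometer-scripts"}}}, // guessed
		{Name: "sg-core-conf-yaml", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "sg-core-conf-yaml"}}}, // guessed
		{Name: "combined-ca-bundle", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "combined-ca-bundle"}}}, // guessed
		{Name: "run-httpd", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		{Name: "log-httpd", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		// kube-api-access-8lhtj is the projected service-account token
		// volume that the API server injects automatically.
	}
	for _, v := range vols {
		fmt.Println(v.Name)
	}
}
```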
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:56 crc kubenswrapper[4962]: E1201 21:56:56.263109 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:56:56 crc kubenswrapper[4962]: E1201 21:56:56.263169 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:56:56 crc kubenswrapper[4962]: I1201 21:56:56.590312 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:56:56 crc kubenswrapper[4962]: E1201 21:56:56.911843 4962 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/42422da9157c6af957315b0497eef05a037adeb8add8b8987a9f7ecd35f1a28c/diff" to get inode usage: stat /var/lib/containers/storage/overlay/42422da9157c6af957315b0497eef05a037adeb8add8b8987a9f7ecd35f1a28c/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-internal-api-0_788bee1d-914f-4efc-acea-67ff250ce73f/glance-log/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-internal-api-0_788bee1d-914f-4efc-acea-67ff250ce73f/glance-log/0.log: no such file or directory Dec 01 21:56:56 crc kubenswrapper[4962]: E1201 21:56:56.966951 4962 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/1fe43c8f32005d56e142b86445cf668fa9bf97513a95d9d6ca56a9c7db9e19fc/diff" to get inode usage: stat /var/lib/containers/storage/overlay/1fe43c8f32005d56e142b86445cf668fa9bf97513a95d9d6ca56a9c7db9e19fc/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-external-api-0_1ded81d0-3be6-4293-abd6-c3434d42667e/glance-log/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-external-api-0_1ded81d0-3be6-4293-abd6-c3434d42667e/glance-log/0.log: no such file or directory Dec 01 21:56:56 crc kubenswrapper[4962]: I1201 21:56:56.975437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerStarted","Data":"72d0dd698dc73e06c4e39c52f73793e111ec32ae436fb2b8a6e1452f077ac5e0"} Dec 01 21:56:57 crc kubenswrapper[4962]: I1201 21:56:57.990297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerStarted","Data":"b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a"} Dec 01 21:56:58 crc kubenswrapper[4962]: E1201 21:56:58.436077 4962 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/6f4d5c7d63f5450f6b928a0be3aed3136bda43328bb8def4032ba605fe094066/diff" to get inode usage: stat /var/lib/containers/storage/overlay/6f4d5c7d63f5450f6b928a0be3aed3136bda43328bb8def4032ba605fe094066/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack_glance-default-internal-api-0_788bee1d-914f-4efc-acea-67ff250ce73f/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-internal-api-0_788bee1d-914f-4efc-acea-67ff250ce73f/glance-httpd/0.log: no such file or directory Dec 01 21:56:58 crc kubenswrapper[4962]: E1201 21:56:58.737246 4962 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/669ab1b4ba108261c5e7036c317fa28f292aa1e919bed8bc3fe9f971ec15d047/diff" to get inode usage: stat /var/lib/containers/storage/overlay/669ab1b4ba108261c5e7036c317fa28f292aa1e919bed8bc3fe9f971ec15d047/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-external-api-0_1ded81d0-3be6-4293-abd6-c3434d42667e/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-external-api-0_1ded81d0-3be6-4293-abd6-c3434d42667e/glance-httpd/0.log: no such file or directory Dec 01 21:56:59 crc kubenswrapper[4962]: I1201 21:56:59.026466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerStarted","Data":"f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61"} Dec 01 21:57:01 crc kubenswrapper[4962]: E1201 21:57:01.238213 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:57:01 crc kubenswrapper[4962]: E1201 21:57:01.240086 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:57:01 crc kubenswrapper[4962]: E1201 21:57:01.242014 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:57:01 crc kubenswrapper[4962]: E1201 21:57:01.242069 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:57:02 crc kubenswrapper[4962]: E1201 21:57:02.009261 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6\": RecentStats: unable to find data in memory cache]" Dec 01 21:57:02 crc kubenswrapper[4962]: E1201 21:57:02.013906 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6\": RecentStats: unable to find data in memory cache]" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.069374 4962 generic.go:334] "Generic (PLEG): container finished" podID="a75e6047-420d-4aa3-a817-90a547491be2" containerID="78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b" exitCode=137 Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.069424 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6648ff8f4d-xwjnk" event={"ID":"a75e6047-420d-4aa3-a817-90a547491be2","Type":"ContainerDied","Data":"78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b"} Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.070485 4962 generic.go:334] "Generic (PLEG): container finished" podID="7b64f053-da53-49c7-a227-dcc84b5c078d" containerID="c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916" exitCode=137 Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.070519 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" event={"ID":"7b64f053-da53-49c7-a227-dcc84b5c078d","Type":"ContainerDied","Data":"c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916"} Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.074094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerStarted","Data":"d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6"} Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.075833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-9zgsj" event={"ID":"c24511ba-ce02-420a-83c0-7ef9a6c4eb47","Type":"ContainerStarted","Data":"2da244bf8c60657deaac54de9fd127e4ff9e1c7fd734afd69cae2978ac0f3a5c"} Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.095976 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-9zgsj" podStartSLOduration=3.081133764 podStartE2EDuration="8.095957215s" podCreationTimestamp="2025-12-01 21:56:54 +0000 UTC" firstStartedPulling="2025-12-01 21:56:55.757722577 +0000 UTC m=+1399.859161762" lastFinishedPulling="2025-12-01 21:57:00.772545998 +0000 UTC m=+1404.873985213" observedRunningTime="2025-12-01 21:57:02.089969444 +0000 UTC m=+1406.191408639" watchObservedRunningTime="2025-12-01 21:57:02.095957215 +0000 UTC m=+1406.197396400" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.416162 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.418102 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfhw7\" (UniqueName: \"kubernetes.io/projected/a75e6047-420d-4aa3-a817-90a547491be2-kube-api-access-kfhw7\") pod \"a75e6047-420d-4aa3-a817-90a547491be2\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532440 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-combined-ca-bundle\") pod \"a75e6047-420d-4aa3-a817-90a547491be2\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532503 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data\") pod \"a75e6047-420d-4aa3-a817-90a547491be2\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532569 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data-custom\") pod \"a75e6047-420d-4aa3-a817-90a547491be2\" (UID: \"a75e6047-420d-4aa3-a817-90a547491be2\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data\") pod \"7b64f053-da53-49c7-a227-dcc84b5c078d\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532796 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdhdl\" (UniqueName: \"kubernetes.io/projected/7b64f053-da53-49c7-a227-dcc84b5c078d-kube-api-access-hdhdl\") pod \"7b64f053-da53-49c7-a227-dcc84b5c078d\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532832 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-combined-ca-bundle\") pod \"7b64f053-da53-49c7-a227-dcc84b5c078d\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.532858 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data-custom\") pod \"7b64f053-da53-49c7-a227-dcc84b5c078d\" (UID: \"7b64f053-da53-49c7-a227-dcc84b5c078d\") " Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.537103 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a75e6047-420d-4aa3-a817-90a547491be2" (UID: "a75e6047-420d-4aa3-a817-90a547491be2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.538412 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75e6047-420d-4aa3-a817-90a547491be2-kube-api-access-kfhw7" (OuterVolumeSpecName: "kube-api-access-kfhw7") pod "a75e6047-420d-4aa3-a817-90a547491be2" (UID: "a75e6047-420d-4aa3-a817-90a547491be2"). InnerVolumeSpecName "kube-api-access-kfhw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.538654 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b64f053-da53-49c7-a227-dcc84b5c078d-kube-api-access-hdhdl" (OuterVolumeSpecName: "kube-api-access-hdhdl") pod "7b64f053-da53-49c7-a227-dcc84b5c078d" (UID: "7b64f053-da53-49c7-a227-dcc84b5c078d"). InnerVolumeSpecName "kube-api-access-hdhdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.539110 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b64f053-da53-49c7-a227-dcc84b5c078d" (UID: "7b64f053-da53-49c7-a227-dcc84b5c078d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.565342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b64f053-da53-49c7-a227-dcc84b5c078d" (UID: "7b64f053-da53-49c7-a227-dcc84b5c078d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.582036 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a75e6047-420d-4aa3-a817-90a547491be2" (UID: "a75e6047-420d-4aa3-a817-90a547491be2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.608219 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data" (OuterVolumeSpecName: "config-data") pod "7b64f053-da53-49c7-a227-dcc84b5c078d" (UID: "7b64f053-da53-49c7-a227-dcc84b5c078d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.613045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data" (OuterVolumeSpecName: "config-data") pod "a75e6047-420d-4aa3-a817-90a547491be2" (UID: "a75e6047-420d-4aa3-a817-90a547491be2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635278 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635312 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdhdl\" (UniqueName: \"kubernetes.io/projected/7b64f053-da53-49c7-a227-dcc84b5c078d-kube-api-access-hdhdl\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635323 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635353 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b64f053-da53-49c7-a227-dcc84b5c078d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635363 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfhw7\" (UniqueName: \"kubernetes.io/projected/a75e6047-420d-4aa3-a817-90a547491be2-kube-api-access-kfhw7\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635371 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635380 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.635392 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a75e6047-420d-4aa3-a817-90a547491be2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.784921 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:57:02 crc kubenswrapper[4962]: I1201 21:57:02.785102 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.091825 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.091839 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58f5d66b6f-bgzrf" event={"ID":"7b64f053-da53-49c7-a227-dcc84b5c078d","Type":"ContainerDied","Data":"33224545768efe14e0f9135291c025e5550cae8ca8ffd28d78d6584d918eacdb"} Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.092900 4962 scope.go:117] "RemoveContainer" containerID="c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916" Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.095634 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerStarted","Data":"c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82"} Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.105208 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.108310 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6648ff8f4d-xwjnk" Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.108341 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6648ff8f4d-xwjnk" event={"ID":"a75e6047-420d-4aa3-a817-90a547491be2","Type":"ContainerDied","Data":"39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6"} Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.138397 4962 scope.go:117] "RemoveContainer" containerID="78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b" Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.148783 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.277318893 podStartE2EDuration="9.14876052s" podCreationTimestamp="2025-12-01 21:56:54 +0000 UTC" firstStartedPulling="2025-12-01 21:56:56.598473082 +0000 UTC m=+1400.699912277" lastFinishedPulling="2025-12-01 21:57:02.469914709 +0000 UTC m=+1406.571353904" observedRunningTime="2025-12-01 21:57:03.142721288 +0000 UTC m=+1407.244160493" watchObservedRunningTime="2025-12-01 21:57:03.14876052 +0000 UTC m=+1407.250199715" Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.178724 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6648ff8f4d-xwjnk"] Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.189291 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6648ff8f4d-xwjnk"] Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.200137 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58f5d66b6f-bgzrf"] Dec 01 21:57:03 crc kubenswrapper[4962]: I1201 21:57:03.212172 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-58f5d66b6f-bgzrf"] Dec 01 21:57:04 crc kubenswrapper[4962]: I1201 21:57:04.128178 4962 generic.go:334] "Generic (PLEG): container finished" podID="c24511ba-ce02-420a-83c0-7ef9a6c4eb47" containerID="2da244bf8c60657deaac54de9fd127e4ff9e1c7fd734afd69cae2978ac0f3a5c" exitCode=0 Dec 01 21:57:04 crc kubenswrapper[4962]: I1201 21:57:04.128281 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-9zgsj" event={"ID":"c24511ba-ce02-420a-83c0-7ef9a6c4eb47","Type":"ContainerDied","Data":"2da244bf8c60657deaac54de9fd127e4ff9e1c7fd734afd69cae2978ac0f3a5c"} Dec 01 21:57:04 crc kubenswrapper[4962]: 
Dec 01 21:57:04 crc kubenswrapper[4962]: I1201 21:57:04.247452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b64f053-da53-49c7-a227-dcc84b5c078d" path="/var/lib/kubelet/pods/7b64f053-da53-49c7-a227-dcc84b5c078d/volumes"
Dec 01 21:57:04 crc kubenswrapper[4962]: I1201 21:57:04.248559 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75e6047-420d-4aa3-a817-90a547491be2" path="/var/lib/kubelet/pods/a75e6047-420d-4aa3-a817-90a547491be2/volumes"
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.645912 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-9zgsj"
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.717344 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-scripts\") pod \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") "
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.717547 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-config-data\") pod \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") "
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.717622 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn4cv\" (UniqueName: \"kubernetes.io/projected/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-kube-api-access-nn4cv\") pod \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") "
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.717658 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-combined-ca-bundle\") pod \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\" (UID: \"c24511ba-ce02-420a-83c0-7ef9a6c4eb47\") "
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.740156 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-scripts" (OuterVolumeSpecName: "scripts") pod "c24511ba-ce02-420a-83c0-7ef9a6c4eb47" (UID: "c24511ba-ce02-420a-83c0-7ef9a6c4eb47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.740340 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-kube-api-access-nn4cv" (OuterVolumeSpecName: "kube-api-access-nn4cv") pod "c24511ba-ce02-420a-83c0-7ef9a6c4eb47" (UID: "c24511ba-ce02-420a-83c0-7ef9a6c4eb47"). InnerVolumeSpecName "kube-api-access-nn4cv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.756127 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c24511ba-ce02-420a-83c0-7ef9a6c4eb47" (UID: "c24511ba-ce02-420a-83c0-7ef9a6c4eb47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
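The two kubelet_volumes.go entries above show the orphan sweep: once a deleted pod no longer has any mounted volumes, kubelet removes its per-pod volumes directory under /var/lib/kubelet/pods. A rough sketch of that check, with a hypothetical cleanupOrphanedPodDir helper (the paths match the log; the logic is illustrative, not kubelet's actual implementation):

```go
// Rough sketch of the orphaned-volume cleanup logged above: if a deleted
// pod's volumes directory contains no remaining plugin entries, remove it.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cleanupOrphanedPodDir(podUID string) error {
	dir := filepath.Join("/var/lib/kubelet/pods", podUID, "volumes")
	entries, err := os.ReadDir(dir)
	if err != nil {
		return err
	}
	if len(entries) > 0 {
		// Still has volume plugin subdirectories; a real kubelet retries later.
		return fmt.Errorf("pod %s still has %d volume entries", podUID, len(entries))
	}
	return os.RemoveAll(dir) // corresponds to "Cleaned up orphaned pod volumes dir"
}

func main() {
	if err := cleanupOrphanedPodDir("7b64f053-da53-49c7-a227-dcc84b5c078d"); err != nil {
		fmt.Println(err)
	}
}
```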
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.773117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-config-data" (OuterVolumeSpecName: "config-data") pod "c24511ba-ce02-420a-83c0-7ef9a6c4eb47" (UID: "c24511ba-ce02-420a-83c0-7ef9a6c4eb47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.820484 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.820547 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn4cv\" (UniqueName: \"kubernetes.io/projected/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-kube-api-access-nn4cv\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.820562 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:05 crc kubenswrapper[4962]: I1201 21:57:05.820574 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c24511ba-ce02-420a-83c0-7ef9a6c4eb47-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:06 crc kubenswrapper[4962]: I1201 21:57:06.186083 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-9zgsj" event={"ID":"c24511ba-ce02-420a-83c0-7ef9a6c4eb47","Type":"ContainerDied","Data":"4e513ec50e459a1f2975da476f8d305e1b87bb2388396c0d6c445662b35eebcc"} Dec 01 21:57:06 crc kubenswrapper[4962]: I1201 21:57:06.186125 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e513ec50e459a1f2975da476f8d305e1b87bb2388396c0d6c445662b35eebcc" Dec 01 21:57:06 crc kubenswrapper[4962]: I1201 21:57:06.186245 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-9zgsj" Dec 01 21:57:06 crc kubenswrapper[4962]: E1201 21:57:06.237585 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:57:06 crc kubenswrapper[4962]: E1201 21:57:06.239445 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:57:06 crc kubenswrapper[4962]: E1201 21:57:06.240795 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 21:57:06 crc kubenswrapper[4962]: E1201 21:57:06.240886 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.531188 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:09 crc kubenswrapper[4962]: E1201 21:57:09.532366 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75e6047-420d-4aa3-a817-90a547491be2" containerName="heat-api" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.532393 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75e6047-420d-4aa3-a817-90a547491be2" containerName="heat-api" Dec 01 21:57:09 crc kubenswrapper[4962]: E1201 21:57:09.532423 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24511ba-ce02-420a-83c0-7ef9a6c4eb47" containerName="aodh-db-sync" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.532436 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24511ba-ce02-420a-83c0-7ef9a6c4eb47" containerName="aodh-db-sync" Dec 01 21:57:09 crc kubenswrapper[4962]: E1201 21:57:09.532473 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b64f053-da53-49c7-a227-dcc84b5c078d" containerName="heat-cfnapi" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.532486 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b64f053-da53-49c7-a227-dcc84b5c078d" containerName="heat-cfnapi" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.532906 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24511ba-ce02-420a-83c0-7ef9a6c4eb47" containerName="aodh-db-sync" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.532974 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75e6047-420d-4aa3-a817-90a547491be2" containerName="heat-api" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.533004 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b64f053-da53-49c7-a227-dcc84b5c078d" containerName="heat-cfnapi" Dec 01 21:57:09 crc 
kubenswrapper[4962]: I1201 21:57:09.536850 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.540461 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-bf8t4" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.540567 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.544922 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.551964 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.617716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-scripts\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.618005 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-config-data\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.618545 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-combined-ca-bundle\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.618906 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjnx2\" (UniqueName: \"kubernetes.io/projected/07ba8682-f1b4-4f16-85c9-99b5dcc82666-kube-api-access-cjnx2\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.721642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-combined-ca-bundle\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.722127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjnx2\" (UniqueName: \"kubernetes.io/projected/07ba8682-f1b4-4f16-85c9-99b5dcc82666-kube-api-access-cjnx2\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.722183 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-scripts\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.722261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-config-data\") pod \"aodh-0\" (UID: 
\"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.731963 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-combined-ca-bundle\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.735289 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-scripts\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.735920 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-config-data\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.757111 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjnx2\" (UniqueName: \"kubernetes.io/projected/07ba8682-f1b4-4f16-85c9-99b5dcc82666-kube-api-access-cjnx2\") pod \"aodh-0\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " pod="openstack/aodh-0" Dec 01 21:57:09 crc kubenswrapper[4962]: W1201 21:57:09.802850 4962 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805240bc_5355_4ffa_886c_a4e96fb3a540.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805240bc_5355_4ffa_886c_a4e96fb3a540.slice: no such file or directory Dec 01 21:57:09 crc kubenswrapper[4962]: W1201 21:57:09.802951 4962 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737cedfd_d3c6_4f5b_8289_af4b32ec094a.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737cedfd_d3c6_4f5b_8289_af4b32ec094a.slice: no such file or directory Dec 01 21:57:09 crc kubenswrapper[4962]: W1201 21:57:09.803145 4962 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc24511ba_ce02_420a_83c0_7ef9a6c4eb47.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc24511ba_ce02_420a_83c0_7ef9a6c4eb47.slice: no such file or directory Dec 01 21:57:09 crc kubenswrapper[4962]: I1201 21:57:09.911233 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 21:57:10 crc kubenswrapper[4962]: E1201 21:57:10.039827 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-conmon-13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-conmon-78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-conmon-e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-conmon-5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-33224545768efe14e0f9135291c025e5550cae8ca8ffd28d78d6584d918eacdb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice\": RecentStats: unable to find data in memory cache]" Dec 01 21:57:10 crc kubenswrapper[4962]: E1201 21:57:10.039890 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-conmon-c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode95f5aa0_0150_49da_a25d_c2eb369d394e.slice/crio-conmon-0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode95f5aa0_0150_49da_a25d_c2eb369d394e.slice/crio-0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-conmon-5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-2552f2dca1b12dabd1917284e7c2a47006998b9f9527e8f0b6250123f888cc95\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-33224545768efe14e0f9135291c025e5550cae8ca8ffd28d78d6584d918eacdb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-conmon-844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-conmon-78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-conmon-13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-conmon-e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-cad83d54a0a2223e5c7a4492a3c5e86b9061a353d47a7a476f5b7a5570c7b218\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice\": RecentStats: unable to find data in memory cache]" Dec 01 21:57:10 crc kubenswrapper[4962]: E1201 21:57:10.061337 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6.scope\": RecentStats: unable to find data 
in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-conmon-c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-conmon-13883312d5b99c02ab6794ce1361846db18e86ab607c6100615a8682932dbec6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-8731a4e99fbe79050dc71c2e66b6c676460403e7b8ad651e4670cdc44f28abfe\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-conmon-e1cf1e720f696aabdf6d9c1f9c857171943a447ef9c523e1907ee394e3205972.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-conmon-5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272645a0_8a27_4b3e_8f4f_5e456f118d84.slice/crio-2552f2dca1b12dabd1917284e7c2a47006998b9f9527e8f0b6250123f888cc95\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-33224545768efe14e0f9135291c025e5550cae8ca8ffd28d78d6584d918eacdb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ded81d0_3be6_4293_abd6_c3434d42667e.slice/crio-conmon-836b8101e947dd78979b83dfa7cb9ebeb2a41def98f01177745280a0ea6e6303.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-conmon-78558ce17d5d4004db2c351716fa0e3661a20c3659d14e739135462444227e5b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75e6047_420d_4aa3_a817_90a547491be2.slice/crio-39e9bcdd2291436e12d4ef9911da2cc6b36a2943fd8fd797a32ec1a8a21058d6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b64f053_da53_49c7_a227_dcc84b5c078d.slice/crio-c8369569da1064735d48a0eb1d8a3f8459f1093e0197a113332c211339f04916.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-conmon-844141a471fd7b5af60ab53947a4cbc50ad1f6e106b8325daf01b1afbb6c9dca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788bee1d_914f_4efc_acea_67ff250ce73f.slice/crio-cad83d54a0a2223e5c7a4492a3c5e86b9061a353d47a7a476f5b7a5570c7b218\": RecentStats: unable to find data in memory cache]" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.232607 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.260514 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmfcd\" (UniqueName: \"kubernetes.io/projected/e95f5aa0-0150-49da-a25d-c2eb369d394e-kube-api-access-zmfcd\") pod \"e95f5aa0-0150-49da-a25d-c2eb369d394e\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.260688 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-config-data\") pod \"e95f5aa0-0150-49da-a25d-c2eb369d394e\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.260919 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-combined-ca-bundle\") pod \"e95f5aa0-0150-49da-a25d-c2eb369d394e\" (UID: \"e95f5aa0-0150-49da-a25d-c2eb369d394e\") " Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.275517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95f5aa0-0150-49da-a25d-c2eb369d394e-kube-api-access-zmfcd" (OuterVolumeSpecName: "kube-api-access-zmfcd") pod "e95f5aa0-0150-49da-a25d-c2eb369d394e" (UID: "e95f5aa0-0150-49da-a25d-c2eb369d394e"). InnerVolumeSpecName "kube-api-access-zmfcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.285105 4962 generic.go:334] "Generic (PLEG): container finished" podID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" exitCode=137 Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.285165 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.285187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e95f5aa0-0150-49da-a25d-c2eb369d394e","Type":"ContainerDied","Data":"0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46"} Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.285601 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e95f5aa0-0150-49da-a25d-c2eb369d394e","Type":"ContainerDied","Data":"ac87588bd2ae85ad91deb089f7e7dff000d4dc47aef8f4f7afa635752c16a097"} Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.285625 4962 scope.go:117] "RemoveContainer" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.334532 4962 scope.go:117] "RemoveContainer" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" Dec 01 21:57:10 crc kubenswrapper[4962]: E1201 21:57:10.335701 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46\": container with ID starting with 0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46 not found: ID does not exist" containerID="0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.335752 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46"} err="failed to get container status \"0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46\": rpc error: code = NotFound desc = could not find container \"0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46\": container with ID starting with 0fc56ad3c6fa62a6af99e37471ecd22b499ff39824bc98bc9d9ec4c5b8c39c46 not found: ID does not exist" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.346176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-config-data" (OuterVolumeSpecName: "config-data") pod "e95f5aa0-0150-49da-a25d-c2eb369d394e" (UID: "e95f5aa0-0150-49da-a25d-c2eb369d394e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.382419 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.382469 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmfcd\" (UniqueName: \"kubernetes.io/projected/e95f5aa0-0150-49da-a25d-c2eb369d394e-kube-api-access-zmfcd\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.425198 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e95f5aa0-0150-49da-a25d-c2eb369d394e" (UID: "e95f5aa0-0150-49da-a25d-c2eb369d394e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.485434 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95f5aa0-0150-49da-a25d-c2eb369d394e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.607847 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.644845 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.655979 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.674792 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:57:10 crc kubenswrapper[4962]: E1201 21:57:10.676590 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.676618 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.676925 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" containerName="nova-cell0-conductor-conductor" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.684830 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.706602 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.707365 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vtxzn" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.747842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.803757 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fttvm\" (UniqueName: \"kubernetes.io/projected/9b1a4101-b960-4d3a-bba2-8472f8b2a726-kube-api-access-fttvm\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.803860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a4101-b960-4d3a-bba2-8472f8b2a726-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.803898 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a4101-b960-4d3a-bba2-8472f8b2a726-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.917222 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fttvm\" (UniqueName: \"kubernetes.io/projected/9b1a4101-b960-4d3a-bba2-8472f8b2a726-kube-api-access-fttvm\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.917416 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a4101-b960-4d3a-bba2-8472f8b2a726-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.917480 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a4101-b960-4d3a-bba2-8472f8b2a726-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.928342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a4101-b960-4d3a-bba2-8472f8b2a726-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.950873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fttvm\" (UniqueName: \"kubernetes.io/projected/9b1a4101-b960-4d3a-bba2-8472f8b2a726-kube-api-access-fttvm\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:10 crc kubenswrapper[4962]: I1201 21:57:10.953538 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a4101-b960-4d3a-bba2-8472f8b2a726-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a4101-b960-4d3a-bba2-8472f8b2a726\") " pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:11 crc kubenswrapper[4962]: I1201 21:57:11.100506 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:11 crc kubenswrapper[4962]: I1201 21:57:11.309229 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerStarted","Data":"5923621d0476aa79a2fefabf810c539fa2bc9a7e19fadc4ded9359e1b0e19a22"} Dec 01 21:57:11 crc kubenswrapper[4962]: I1201 21:57:11.690668 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.249095 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95f5aa0-0150-49da-a25d-c2eb369d394e" path="/var/lib/kubelet/pods/e95f5aa0-0150-49da-a25d-c2eb369d394e/volumes" Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.323034 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerStarted","Data":"7e4570bd611c799e9c21509ffe5b73a1bdfd7d06d349aaae1e9e520a1bc9174b"} Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.324989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9b1a4101-b960-4d3a-bba2-8472f8b2a726","Type":"ContainerStarted","Data":"0d4f87365afb4688d21933bfa8cc8d037657e44b0f8b881007f7b44afd1b6e83"} Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.325026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9b1a4101-b960-4d3a-bba2-8472f8b2a726","Type":"ContainerStarted","Data":"cccd9dba2c7824eae820569a6a8180a9399bf79a129719cd9f6546b5c9399309"} Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.325225 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.346882 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.346860409 podStartE2EDuration="2.346860409s" podCreationTimestamp="2025-12-01 21:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:12.342069962 +0000 UTC m=+1416.443509177" watchObservedRunningTime="2025-12-01 21:57:12.346860409 +0000 UTC m=+1416.448299604" Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.547636 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.547883 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-central-agent" containerID="cri-o://b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a" gracePeriod=30 Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.548023 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="proxy-httpd" containerID="cri-o://c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82" gracePeriod=30 Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.548061 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="sg-core" 
containerID="cri-o://d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6" gracePeriod=30 Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.548091 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-notification-agent" containerID="cri-o://f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61" gracePeriod=30 Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.555576 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 21:57:12 crc kubenswrapper[4962]: I1201 21:57:12.969732 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:13 crc kubenswrapper[4962]: I1201 21:57:13.339671 4962 generic.go:334] "Generic (PLEG): container finished" podID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerID="c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82" exitCode=0 Dec 01 21:57:13 crc kubenswrapper[4962]: I1201 21:57:13.339965 4962 generic.go:334] "Generic (PLEG): container finished" podID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerID="d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6" exitCode=2 Dec 01 21:57:13 crc kubenswrapper[4962]: I1201 21:57:13.339976 4962 generic.go:334] "Generic (PLEG): container finished" podID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerID="b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a" exitCode=0 Dec 01 21:57:13 crc kubenswrapper[4962]: I1201 21:57:13.339848 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerDied","Data":"c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82"} Dec 01 21:57:13 crc kubenswrapper[4962]: I1201 21:57:13.340022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerDied","Data":"d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6"} Dec 01 21:57:13 crc kubenswrapper[4962]: I1201 21:57:13.340036 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerDied","Data":"b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a"} Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.358567 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerStarted","Data":"269869044a427490757176305f563e559f01fad883064266d82b586da8054a16"} Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.859868 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.926620 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-scripts\") pod \"825ec49a-9597-4e47-8c1c-9fac21baed5a\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.926711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lhtj\" (UniqueName: \"kubernetes.io/projected/825ec49a-9597-4e47-8c1c-9fac21baed5a-kube-api-access-8lhtj\") pod \"825ec49a-9597-4e47-8c1c-9fac21baed5a\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.926744 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-config-data\") pod \"825ec49a-9597-4e47-8c1c-9fac21baed5a\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.926812 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-run-httpd\") pod \"825ec49a-9597-4e47-8c1c-9fac21baed5a\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.926955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-combined-ca-bundle\") pod \"825ec49a-9597-4e47-8c1c-9fac21baed5a\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.927009 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-log-httpd\") pod \"825ec49a-9597-4e47-8c1c-9fac21baed5a\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.927063 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-sg-core-conf-yaml\") pod \"825ec49a-9597-4e47-8c1c-9fac21baed5a\" (UID: \"825ec49a-9597-4e47-8c1c-9fac21baed5a\") " Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.934635 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "825ec49a-9597-4e47-8c1c-9fac21baed5a" (UID: "825ec49a-9597-4e47-8c1c-9fac21baed5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.935704 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "825ec49a-9597-4e47-8c1c-9fac21baed5a" (UID: "825ec49a-9597-4e47-8c1c-9fac21baed5a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.939377 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-scripts" (OuterVolumeSpecName: "scripts") pod "825ec49a-9597-4e47-8c1c-9fac21baed5a" (UID: "825ec49a-9597-4e47-8c1c-9fac21baed5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.941925 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825ec49a-9597-4e47-8c1c-9fac21baed5a-kube-api-access-8lhtj" (OuterVolumeSpecName: "kube-api-access-8lhtj") pod "825ec49a-9597-4e47-8c1c-9fac21baed5a" (UID: "825ec49a-9597-4e47-8c1c-9fac21baed5a"). InnerVolumeSpecName "kube-api-access-8lhtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:14 crc kubenswrapper[4962]: I1201 21:57:14.979607 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "825ec49a-9597-4e47-8c1c-9fac21baed5a" (UID: "825ec49a-9597-4e47-8c1c-9fac21baed5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.029370 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.029616 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/825ec49a-9597-4e47-8c1c-9fac21baed5a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.029625 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.029633 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.029642 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lhtj\" (UniqueName: \"kubernetes.io/projected/825ec49a-9597-4e47-8c1c-9fac21baed5a-kube-api-access-8lhtj\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.053102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "825ec49a-9597-4e47-8c1c-9fac21baed5a" (UID: "825ec49a-9597-4e47-8c1c-9fac21baed5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.105521 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-config-data" (OuterVolumeSpecName: "config-data") pod "825ec49a-9597-4e47-8c1c-9fac21baed5a" (UID: "825ec49a-9597-4e47-8c1c-9fac21baed5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.131813 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.131853 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825ec49a-9597-4e47-8c1c-9fac21baed5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.374515 4962 generic.go:334] "Generic (PLEG): container finished" podID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerID="f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61" exitCode=0 Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.374682 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.376255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerDied","Data":"f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61"} Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.376296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"825ec49a-9597-4e47-8c1c-9fac21baed5a","Type":"ContainerDied","Data":"72d0dd698dc73e06c4e39c52f73793e111ec32ae436fb2b8a6e1452f077ac5e0"} Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.376313 4962 scope.go:117] "RemoveContainer" containerID="c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.405052 4962 scope.go:117] "RemoveContainer" containerID="d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.419634 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.437324 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.455410 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.456007 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="sg-core" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456029 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="sg-core" Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.456067 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-notification-agent" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456075 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-notification-agent" Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.456094 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-central-agent" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456102 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-central-agent" Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.456124 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="proxy-httpd" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456130 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="proxy-httpd" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456376 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-notification-agent" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456399 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="ceilometer-central-agent" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456409 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="proxy-httpd" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.456431 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" containerName="sg-core" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.458539 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.460752 4962 scope.go:117] "RemoveContainer" containerID="f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.461440 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.463679 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.469853 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.497426 4962 scope.go:117] "RemoveContainer" containerID="b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.531227 4962 scope.go:117] "RemoveContainer" containerID="c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82" Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.538572 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82\": container with ID starting with c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82 not found: ID does not exist" containerID="c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.538623 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82"} err="failed to get container status \"c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82\": rpc error: code = NotFound desc = could not find container \"c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82\": container with ID starting with c6ccba931080d5bf36b02ec1175f81abf0f12a6feede81fd027d5bd45a1adf82 not found: ID does not exist" Dec 01 
21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.538662 4962 scope.go:117] "RemoveContainer" containerID="d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6" Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.539716 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6\": container with ID starting with d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6 not found: ID does not exist" containerID="d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.539869 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6"} err="failed to get container status \"d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6\": rpc error: code = NotFound desc = could not find container \"d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6\": container with ID starting with d8f63d67dadc85428fbe743fc163168641ab662785348e82976dbb0d5a8249c6 not found: ID does not exist" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.539908 4962 scope.go:117] "RemoveContainer" containerID="f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61" Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.540801 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61\": container with ID starting with f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61 not found: ID does not exist" containerID="f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.540852 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61"} err="failed to get container status \"f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61\": rpc error: code = NotFound desc = could not find container \"f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61\": container with ID starting with f9a9d8b8c02560188a175645d67481fad9804595608e1db96150422f6fb6ed61 not found: ID does not exist" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.540880 4962 scope.go:117] "RemoveContainer" containerID="b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a" Dec 01 21:57:15 crc kubenswrapper[4962]: E1201 21:57:15.541243 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a\": container with ID starting with b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a not found: ID does not exist" containerID="b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.541275 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a"} err="failed to get container status \"b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a\": rpc error: code = NotFound desc = could not find container 
\"b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a\": container with ID starting with b37089abafb0b561fc8bb52a4289214dd10b68fdeb17894fca7701c71115549a not found: ID does not exist" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.553441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7w42\" (UniqueName: \"kubernetes.io/projected/f6a46329-485b-472a-9c6c-d3c79dabfb91-kube-api-access-l7w42\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.553743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-run-httpd\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.553795 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-config-data\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.553881 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.554001 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.554081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-scripts\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.554242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-log-httpd\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.661165 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.661230 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-scripts\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 
21:57:15.661307 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-log-httpd\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.661381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w42\" (UniqueName: \"kubernetes.io/projected/f6a46329-485b-472a-9c6c-d3c79dabfb91-kube-api-access-l7w42\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.661465 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-run-httpd\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.661488 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-config-data\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.661518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.662303 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-run-httpd\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.662395 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-log-httpd\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.669023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-scripts\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.669686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.670902 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.671534 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-config-data\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.677276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7w42\" (UniqueName: \"kubernetes.io/projected/f6a46329-485b-472a-9c6c-d3c79dabfb91-kube-api-access-l7w42\") pod \"ceilometer-0\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " pod="openstack/ceilometer-0" Dec 01 21:57:15 crc kubenswrapper[4962]: I1201 21:57:15.777043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:16 crc kubenswrapper[4962]: I1201 21:57:16.232261 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="825ec49a-9597-4e47-8c1c-9fac21baed5a" path="/var/lib/kubelet/pods/825ec49a-9597-4e47-8c1c-9fac21baed5a/volumes" Dec 01 21:57:16 crc kubenswrapper[4962]: I1201 21:57:16.307729 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:16 crc kubenswrapper[4962]: I1201 21:57:16.393881 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerStarted","Data":"1c82357763d2755c096e5c1ce06e13685c776c00b018c1ad08e7b207164c2624"} Dec 01 21:57:16 crc kubenswrapper[4962]: I1201 21:57:16.395868 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerStarted","Data":"a5728c452ee33f2eafd2f8441ed6962da04a90c4a8f27f9f3f1495b681f8bcdc"} Dec 01 21:57:17 crc kubenswrapper[4962]: I1201 21:57:17.413895 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerStarted","Data":"592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e"} Dec 01 21:57:18 crc kubenswrapper[4962]: I1201 21:57:18.427846 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerStarted","Data":"fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e"} Dec 01 21:57:18 crc kubenswrapper[4962]: I1201 21:57:18.431783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerStarted","Data":"89272950bdbf82012cd12825bfcbf1da1de172ee4d61b5963fa60c515724d833"} Dec 01 21:57:18 crc kubenswrapper[4962]: I1201 21:57:18.432024 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-api" containerID="cri-o://7e4570bd611c799e9c21509ffe5b73a1bdfd7d06d349aaae1e9e520a1bc9174b" gracePeriod=30 Dec 01 21:57:18 crc kubenswrapper[4962]: I1201 21:57:18.432799 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-listener" containerID="cri-o://89272950bdbf82012cd12825bfcbf1da1de172ee4d61b5963fa60c515724d833" gracePeriod=30 Dec 01 21:57:18 crc kubenswrapper[4962]: I1201 21:57:18.432899 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" 
containerName="aodh-notifier" containerID="cri-o://a5728c452ee33f2eafd2f8441ed6962da04a90c4a8f27f9f3f1495b681f8bcdc" gracePeriod=30 Dec 01 21:57:18 crc kubenswrapper[4962]: I1201 21:57:18.432990 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-evaluator" containerID="cri-o://269869044a427490757176305f563e559f01fad883064266d82b586da8054a16" gracePeriod=30 Dec 01 21:57:18 crc kubenswrapper[4962]: I1201 21:57:18.464407 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.166616803 podStartE2EDuration="9.464385467s" podCreationTimestamp="2025-12-01 21:57:09 +0000 UTC" firstStartedPulling="2025-12-01 21:57:10.592672229 +0000 UTC m=+1414.694111424" lastFinishedPulling="2025-12-01 21:57:16.890440873 +0000 UTC m=+1420.991880088" observedRunningTime="2025-12-01 21:57:18.460621939 +0000 UTC m=+1422.562061154" watchObservedRunningTime="2025-12-01 21:57:18.464385467 +0000 UTC m=+1422.565824682" Dec 01 21:57:19 crc kubenswrapper[4962]: I1201 21:57:19.444888 4962 generic.go:334] "Generic (PLEG): container finished" podID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerID="a5728c452ee33f2eafd2f8441ed6962da04a90c4a8f27f9f3f1495b681f8bcdc" exitCode=0 Dec 01 21:57:19 crc kubenswrapper[4962]: I1201 21:57:19.445160 4962 generic.go:334] "Generic (PLEG): container finished" podID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerID="269869044a427490757176305f563e559f01fad883064266d82b586da8054a16" exitCode=0 Dec 01 21:57:19 crc kubenswrapper[4962]: I1201 21:57:19.445171 4962 generic.go:334] "Generic (PLEG): container finished" podID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerID="7e4570bd611c799e9c21509ffe5b73a1bdfd7d06d349aaae1e9e520a1bc9174b" exitCode=0 Dec 01 21:57:19 crc kubenswrapper[4962]: I1201 21:57:19.445056 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerDied","Data":"a5728c452ee33f2eafd2f8441ed6962da04a90c4a8f27f9f3f1495b681f8bcdc"} Dec 01 21:57:19 crc kubenswrapper[4962]: I1201 21:57:19.445245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerDied","Data":"269869044a427490757176305f563e559f01fad883064266d82b586da8054a16"} Dec 01 21:57:19 crc kubenswrapper[4962]: I1201 21:57:19.445260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerDied","Data":"7e4570bd611c799e9c21509ffe5b73a1bdfd7d06d349aaae1e9e520a1bc9174b"} Dec 01 21:57:19 crc kubenswrapper[4962]: I1201 21:57:19.447654 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerStarted","Data":"ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2"} Dec 01 21:57:20 crc kubenswrapper[4962]: I1201 21:57:20.464795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerStarted","Data":"026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f"} Dec 01 21:57:20 crc kubenswrapper[4962]: I1201 21:57:20.465445 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:57:20 crc kubenswrapper[4962]: I1201 21:57:20.493781 4962 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.795005764 podStartE2EDuration="5.493764548s" podCreationTimestamp="2025-12-01 21:57:15 +0000 UTC" firstStartedPulling="2025-12-01 21:57:16.313609008 +0000 UTC m=+1420.415048203" lastFinishedPulling="2025-12-01 21:57:20.012367792 +0000 UTC m=+1424.113806987" observedRunningTime="2025-12-01 21:57:20.483434956 +0000 UTC m=+1424.584874161" watchObservedRunningTime="2025-12-01 21:57:20.493764548 +0000 UTC m=+1424.595203743" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.144721 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.711981 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fvdgt"] Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.722516 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.734925 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvdgt"] Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.739919 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.740124 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.837430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x2r7\" (UniqueName: \"kubernetes.io/projected/e236312e-ed98-484b-ad3e-9ed5a6645df0-kube-api-access-8x2r7\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.837516 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-scripts\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.837557 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.837656 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-config-data\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.860085 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.862483 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.866341 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.877670 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.879609 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.883862 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.896629 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.931809 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939282 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gfrs\" (UniqueName: \"kubernetes.io/projected/e8f2b876-3ead-4d4c-b122-de64ae016e7f-kube-api-access-4gfrs\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939373 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x2r7\" (UniqueName: \"kubernetes.io/projected/e236312e-ed98-484b-ad3e-9ed5a6645df0-kube-api-access-8x2r7\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-scripts\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939617 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939669 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f2b876-3ead-4d4c-b122-de64ae016e7f-logs\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939691 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-config-data\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939706 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a57d6fa-2989-48c3-b66e-6362b284eda4-logs\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-config-data\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-config-data\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j2lt\" (UniqueName: \"kubernetes.io/projected/5a57d6fa-2989-48c3-b66e-6362b284eda4-kube-api-access-9j2lt\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.939826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.946908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.961627 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-scripts\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.965742 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-config-data\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:21 crc kubenswrapper[4962]: I1201 21:57:21.971512 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x2r7\" (UniqueName: \"kubernetes.io/projected/e236312e-ed98-484b-ad3e-9ed5a6645df0-kube-api-access-8x2r7\") pod \"nova-cell0-cell-mapping-fvdgt\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " 
pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.003339 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.006210 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.029834 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.048351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-config-data\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.048602 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-config-data\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.048681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j2lt\" (UniqueName: \"kubernetes.io/projected/5a57d6fa-2989-48c3-b66e-6362b284eda4-kube-api-access-9j2lt\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.048749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.048830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gfrs\" (UniqueName: \"kubernetes.io/projected/e8f2b876-3ead-4d4c-b122-de64ae016e7f-kube-api-access-4gfrs\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.049292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.049397 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f2b876-3ead-4d4c-b122-de64ae016e7f-logs\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.049472 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a57d6fa-2989-48c3-b66e-6362b284eda4-logs\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.049898 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5a57d6fa-2989-48c3-b66e-6362b284eda4-logs\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.051463 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.053344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f2b876-3ead-4d4c-b122-de64ae016e7f-logs\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.061989 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.068439 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-config-data\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.071890 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-config-data\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.073613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.082260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.082598 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j2lt\" (UniqueName: \"kubernetes.io/projected/5a57d6fa-2989-48c3-b66e-6362b284eda4-kube-api-access-9j2lt\") pod \"nova-api-0\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.093012 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gfrs\" (UniqueName: \"kubernetes.io/projected/e8f2b876-3ead-4d4c-b122-de64ae016e7f-kube-api-access-4gfrs\") pod \"nova-metadata-0\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.097588 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-rw5lw"] Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.108135 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.151266 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.151450 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkvf\" (UniqueName: \"kubernetes.io/projected/99834bd0-d60e-45fd-b9f9-2d252fd9117c-kube-api-access-7mkvf\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.151519 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-config-data\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.162532 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-rw5lw"] Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.204513 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.209535 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.211079 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.215660 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.225561 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.256709 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkvf\" (UniqueName: \"kubernetes.io/projected/99834bd0-d60e-45fd-b9f9-2d252fd9117c-kube-api-access-7mkvf\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.256796 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.256846 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-config-data\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.256888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.256907 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.256926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcbz\" (UniqueName: \"kubernetes.io/projected/a81298b0-e9da-494b-ac1c-4c7e3e1be818-kube-api-access-bxcbz\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.263748 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-config\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.263802 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.263909 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " 
pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.273375 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.298537 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-config-data\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.311871 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mkvf\" (UniqueName: \"kubernetes.io/projected/99834bd0-d60e-45fd-b9f9-2d252fd9117c-kube-api-access-7mkvf\") pod \"nova-scheduler-0\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.347443 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.366790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.366853 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.366914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.366963 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.366986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcbz\" (UniqueName: \"kubernetes.io/projected/a81298b0-e9da-494b-ac1c-4c7e3e1be818-kube-api-access-bxcbz\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.367028 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frw2c\" (UniqueName: \"kubernetes.io/projected/65d408e2-365e-4ab9-9077-ea1706b8d4a2-kube-api-access-frw2c\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.367087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-config\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.367120 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.367181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.367808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.368269 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.368368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.383848 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-config\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.387584 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.399734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcbz\" (UniqueName: \"kubernetes.io/projected/a81298b0-e9da-494b-ac1c-4c7e3e1be818-kube-api-access-bxcbz\") pod \"dnsmasq-dns-568d7fd7cf-rw5lw\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 
21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.469395 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.469496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frw2c\" (UniqueName: \"kubernetes.io/projected/65d408e2-365e-4ab9-9077-ea1706b8d4a2-kube-api-access-frw2c\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.469690 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.490830 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.508900 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frw2c\" (UniqueName: \"kubernetes.io/projected/65d408e2-365e-4ab9-9077-ea1706b8d4a2-kube-api-access-frw2c\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.514351 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.539457 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.641081 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:22 crc kubenswrapper[4962]: I1201 21:57:22.661150 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.096668 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.260470 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.271700 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvdgt"] Dec 01 21:57:23 crc kubenswrapper[4962]: W1201 21:57:23.279054 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode236312e_ed98_484b_ad3e_9ed5a6645df0.slice/crio-4d26bcdbc39958b489203334a4c288ee659b4a1764785ce9543a5876dd165a21 WatchSource:0}: Error finding container 4d26bcdbc39958b489203334a4c288ee659b4a1764785ce9543a5876dd165a21: Status 404 returned error can't find the container with id 4d26bcdbc39958b489203334a4c288ee659b4a1764785ce9543a5876dd165a21 Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.526845 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a57d6fa-2989-48c3-b66e-6362b284eda4","Type":"ContainerStarted","Data":"7006d324db4297f99cca2d534575134bf1169582b98b90898c655b3cedec764d"} Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.532120 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvdgt" event={"ID":"e236312e-ed98-484b-ad3e-9ed5a6645df0","Type":"ContainerStarted","Data":"4d26bcdbc39958b489203334a4c288ee659b4a1764785ce9543a5876dd165a21"} Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.537654 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8f2b876-3ead-4d4c-b122-de64ae016e7f","Type":"ContainerStarted","Data":"c057100b6feb8ff9d432d01cb0f8ffca404d8284c4534d2d7c7dac8bcf367418"} Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.631391 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-rw5lw"] Dec 01 21:57:23 crc kubenswrapper[4962]: W1201 21:57:23.658557 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99834bd0_d60e_45fd_b9f9_2d252fd9117c.slice/crio-53cd5e82a129426077ed2007096cb57f5a771472cf0d4a9f14271ff700d82934 WatchSource:0}: Error finding container 53cd5e82a129426077ed2007096cb57f5a771472cf0d4a9f14271ff700d82934: Status 404 returned error can't find the container with id 53cd5e82a129426077ed2007096cb57f5a771472cf0d4a9f14271ff700d82934 Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.661774 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.673501 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.731003 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-58lcl"] Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.732431 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.734097 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.735571 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.764514 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-58lcl"] Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.829283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.829345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-scripts\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.829510 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lsnd\" (UniqueName: \"kubernetes.io/projected/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-kube-api-access-4lsnd\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.829538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-config-data\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.931415 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lsnd\" (UniqueName: \"kubernetes.io/projected/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-kube-api-access-4lsnd\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.931819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-config-data\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.931894 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.931957 4962 
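
[annotation] The "SyncLoop ADD" / "SyncLoop UPDATE" entries are the kubelet's view of changes arriving from the API server. The same pod lifecycle can be observed from the other side with a client-go watch; a sketch, assuming in-cluster credentials and using the pod name from the entries above:

    // watchpod.go — observe ADD/MODIFIED/DELETED events for one pod via the API
    // server, the counterpart of the kubelet's "SyncLoop ADD/UPDATE" entries.
    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        cfg, err := rest.InClusterConfig() // assumes this runs inside the cluster
        if err != nil {
            panic(err)
        }
        clientset := kubernetes.NewForConfigOrDie(cfg)
        w, err := clientset.CoreV1().Pods("openstack").Watch(context.TODO(), metav1.ListOptions{
            FieldSelector: "metadata.name=nova-cell1-conductor-db-sync-58lcl",
        })
        if err != nil {
            panic(err)
        }
        defer w.Stop()
        for ev := range w.ResultChan() {
            fmt.Printf("%s %v\n", ev.Type, ev.Object.GetObjectKind().GroupVersionKind())
        }
    }

[end annotation]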
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-scripts\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.936359 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.936489 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-scripts\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.942127 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-config-data\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:23 crc kubenswrapper[4962]: I1201 21:57:23.957436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lsnd\" (UniqueName: \"kubernetes.io/projected/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-kube-api-access-4lsnd\") pod \"nova-cell1-conductor-db-sync-58lcl\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.117564 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.564915 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"65d408e2-365e-4ab9-9077-ea1706b8d4a2","Type":"ContainerStarted","Data":"abf6059f16ab3232d9a6140b5bdedcd0233a9f42c890764f61a0ef1219cf3973"} Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.572203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99834bd0-d60e-45fd-b9f9-2d252fd9117c","Type":"ContainerStarted","Data":"53cd5e82a129426077ed2007096cb57f5a771472cf0d4a9f14271ff700d82934"} Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.575162 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvdgt" event={"ID":"e236312e-ed98-484b-ad3e-9ed5a6645df0","Type":"ContainerStarted","Data":"1d0ff72f8769c0fd22c9037b58fc1ff886d1665425352ace8895034a81a97248"} Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.578319 4962 generic.go:334] "Generic (PLEG): container finished" podID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerID="4b10d51e358d870a2c8b9428c9c8d99acd292597b8743870c73fe4b1823372d1" exitCode=0 Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.578357 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" event={"ID":"a81298b0-e9da-494b-ac1c-4c7e3e1be818","Type":"ContainerDied","Data":"4b10d51e358d870a2c8b9428c9c8d99acd292597b8743870c73fe4b1823372d1"} Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.578379 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" event={"ID":"a81298b0-e9da-494b-ac1c-4c7e3e1be818","Type":"ContainerStarted","Data":"649bf8f1fd0c04a10d7cd6a764163af6ac92ac0c1f940389a7a5fce714da3079"} Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.606688 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fvdgt" podStartSLOduration=3.6066724949999998 podStartE2EDuration="3.606672495s" podCreationTimestamp="2025-12-01 21:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:24.596697772 +0000 UTC m=+1428.698136977" watchObservedRunningTime="2025-12-01 21:57:24.606672495 +0000 UTC m=+1428.708111690" Dec 01 21:57:24 crc kubenswrapper[4962]: W1201 21:57:24.709554 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf337e4ae_c46a_49f4_b8fe_bb7973cbdf8d.slice/crio-c4835f0d0622d3b3cefab2c05641fd7984516534a427deed2f1b48c2106c3480 WatchSource:0}: Error finding container c4835f0d0622d3b3cefab2c05641fd7984516534a427deed2f1b48c2106c3480: Status 404 returned error can't find the container with id c4835f0d0622d3b3cefab2c05641fd7984516534a427deed2f1b48c2106c3480 Dec 01 21:57:24 crc kubenswrapper[4962]: I1201 21:57:24.711294 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-58lcl"] Dec 01 21:57:25 crc kubenswrapper[4962]: I1201 21:57:25.592644 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" event={"ID":"a81298b0-e9da-494b-ac1c-4c7e3e1be818","Type":"ContainerStarted","Data":"b022597fb5249e1e727c92ba7738ab1dc71db487e4c705c7d496197bf238eee1"} Dec 01 21:57:25 crc kubenswrapper[4962]: I1201 21:57:25.593054 
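
[annotation] The pod_startup_latency_tracker.go entries above carry their own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes time spent pulling images. Here the pull timestamps are the zero value (no pull happened), so the two durations match. A quick recomputation of the dnsmasq-dns numbers:

    // sloduration.go — recompute the dnsmasq-dns pod's start latency from the
    // timestamps logged by pod_startup_latency_tracker.go above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2025-12-01 21:57:21 +0000 UTC")
        running, _ := time.Parse(layout, "2025-12-01 21:57:25.619343111 +0000 UTC")
        // With zero-valued pull timestamps, SLO duration == E2E duration.
        fmt.Println(running.Sub(created)) // 4.619343111s, matching podStartE2EDuration
    }

[end annotation]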
4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:25 crc kubenswrapper[4962]: I1201 21:57:25.595293 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-58lcl" event={"ID":"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d","Type":"ContainerStarted","Data":"0fbb87ea3559411331543374877d9cc86ad777f447b9042fed0047fb49676970"} Dec 01 21:57:25 crc kubenswrapper[4962]: I1201 21:57:25.595350 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-58lcl" event={"ID":"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d","Type":"ContainerStarted","Data":"c4835f0d0622d3b3cefab2c05641fd7984516534a427deed2f1b48c2106c3480"} Dec 01 21:57:25 crc kubenswrapper[4962]: I1201 21:57:25.619357 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" podStartSLOduration=4.619343111 podStartE2EDuration="4.619343111s" podCreationTimestamp="2025-12-01 21:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:25.613853276 +0000 UTC m=+1429.715292501" watchObservedRunningTime="2025-12-01 21:57:25.619343111 +0000 UTC m=+1429.720782306" Dec 01 21:57:25 crc kubenswrapper[4962]: I1201 21:57:25.635948 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-58lcl" podStartSLOduration=2.6359144 podStartE2EDuration="2.6359144s" podCreationTimestamp="2025-12-01 21:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:25.634233873 +0000 UTC m=+1429.735673078" watchObservedRunningTime="2025-12-01 21:57:25.6359144 +0000 UTC m=+1429.737353595" Dec 01 21:57:26 crc kubenswrapper[4962]: I1201 21:57:26.162280 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:26 crc kubenswrapper[4962]: I1201 21:57:26.178201 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.629801 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99834bd0-d60e-45fd-b9f9-2d252fd9117c","Type":"ContainerStarted","Data":"18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c"} Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.631443 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a57d6fa-2989-48c3-b66e-6362b284eda4","Type":"ContainerStarted","Data":"7e3cf5e00421be7fc94877dd0e0c9871052b09c5b0041455aa5cf7869223e600"} Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.631479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a57d6fa-2989-48c3-b66e-6362b284eda4","Type":"ContainerStarted","Data":"64de11a8be27f77a842e09d38601f7326e01819ceb15dad617e7e51d9e526011"} Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.633443 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8f2b876-3ead-4d4c-b122-de64ae016e7f","Type":"ContainerStarted","Data":"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53"} Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.633724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e8f2b876-3ead-4d4c-b122-de64ae016e7f","Type":"ContainerStarted","Data":"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d"} Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.633616 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-metadata" containerID="cri-o://1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53" gracePeriod=30 Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.633588 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-log" containerID="cri-o://64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d" gracePeriod=30 Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.635504 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"65d408e2-365e-4ab9-9077-ea1706b8d4a2","Type":"ContainerStarted","Data":"0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf"} Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.635557 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="65d408e2-365e-4ab9-9077-ea1706b8d4a2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf" gracePeriod=30 Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.655217 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.7149287380000002 podStartE2EDuration="7.655199639s" podCreationTimestamp="2025-12-01 21:57:21 +0000 UTC" firstStartedPulling="2025-12-01 21:57:23.674810356 +0000 UTC m=+1427.776249541" lastFinishedPulling="2025-12-01 21:57:27.615081247 +0000 UTC m=+1431.716520442" observedRunningTime="2025-12-01 21:57:28.648332625 +0000 UTC m=+1432.749771820" watchObservedRunningTime="2025-12-01 21:57:28.655199639 +0000 UTC m=+1432.756638834" Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.678782 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.193021375 podStartE2EDuration="7.678759596s" podCreationTimestamp="2025-12-01 21:57:21 +0000 UTC" firstStartedPulling="2025-12-01 21:57:23.098155233 +0000 UTC m=+1427.199594428" lastFinishedPulling="2025-12-01 21:57:27.583893454 +0000 UTC m=+1431.685332649" observedRunningTime="2025-12-01 21:57:28.672596702 +0000 UTC m=+1432.774035897" watchObservedRunningTime="2025-12-01 21:57:28.678759596 +0000 UTC m=+1432.780198791" Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.692047 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.743723955 podStartE2EDuration="6.692031582s" podCreationTimestamp="2025-12-01 21:57:22 +0000 UTC" firstStartedPulling="2025-12-01 21:57:23.68022002 +0000 UTC m=+1427.781659205" lastFinishedPulling="2025-12-01 21:57:27.628527637 +0000 UTC m=+1431.729966832" observedRunningTime="2025-12-01 21:57:28.688519943 +0000 UTC m=+1432.789959138" watchObservedRunningTime="2025-12-01 21:57:28.692031582 +0000 UTC m=+1432.793470777" Dec 01 21:57:28 crc kubenswrapper[4962]: I1201 21:57:28.715065 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-0" podStartSLOduration=3.372914137 podStartE2EDuration="7.715040573s" podCreationTimestamp="2025-12-01 21:57:21 +0000 UTC" firstStartedPulling="2025-12-01 21:57:23.260139438 +0000 UTC m=+1427.361578633" lastFinishedPulling="2025-12-01 21:57:27.602265874 +0000 UTC m=+1431.703705069" observedRunningTime="2025-12-01 21:57:28.712978165 +0000 UTC m=+1432.814417360" watchObservedRunningTime="2025-12-01 21:57:28.715040573 +0000 UTC m=+1432.816479778" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.362702 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.477489 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-combined-ca-bundle\") pod \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.477645 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f2b876-3ead-4d4c-b122-de64ae016e7f-logs\") pod \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.477700 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gfrs\" (UniqueName: \"kubernetes.io/projected/e8f2b876-3ead-4d4c-b122-de64ae016e7f-kube-api-access-4gfrs\") pod \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.477765 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-config-data\") pod \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\" (UID: \"e8f2b876-3ead-4d4c-b122-de64ae016e7f\") " Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.478329 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f2b876-3ead-4d4c-b122-de64ae016e7f-logs" (OuterVolumeSpecName: "logs") pod "e8f2b876-3ead-4d4c-b122-de64ae016e7f" (UID: "e8f2b876-3ead-4d4c-b122-de64ae016e7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.478796 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8f2b876-3ead-4d4c-b122-de64ae016e7f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.482532 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f2b876-3ead-4d4c-b122-de64ae016e7f-kube-api-access-4gfrs" (OuterVolumeSpecName: "kube-api-access-4gfrs") pod "e8f2b876-3ead-4d4c-b122-de64ae016e7f" (UID: "e8f2b876-3ead-4d4c-b122-de64ae016e7f"). InnerVolumeSpecName "kube-api-access-4gfrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.515314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8f2b876-3ead-4d4c-b122-de64ae016e7f" (UID: "e8f2b876-3ead-4d4c-b122-de64ae016e7f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.521551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-config-data" (OuterVolumeSpecName: "config-data") pod "e8f2b876-3ead-4d4c-b122-de64ae016e7f" (UID: "e8f2b876-3ead-4d4c-b122-de64ae016e7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.581239 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gfrs\" (UniqueName: \"kubernetes.io/projected/e8f2b876-3ead-4d4c-b122-de64ae016e7f-kube-api-access-4gfrs\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.581278 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.581287 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f2b876-3ead-4d4c-b122-de64ae016e7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.651554 4962 generic.go:334] "Generic (PLEG): container finished" podID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerID="1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53" exitCode=0 Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.651593 4962 generic.go:334] "Generic (PLEG): container finished" podID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerID="64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d" exitCode=143 Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.651657 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.651682 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8f2b876-3ead-4d4c-b122-de64ae016e7f","Type":"ContainerDied","Data":"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53"} Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.651736 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8f2b876-3ead-4d4c-b122-de64ae016e7f","Type":"ContainerDied","Data":"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d"} Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.651749 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8f2b876-3ead-4d4c-b122-de64ae016e7f","Type":"ContainerDied","Data":"c057100b6feb8ff9d432d01cb0f8ffca404d8284c4534d2d7c7dac8bcf367418"} Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.651767 4962 scope.go:117] "RemoveContainer" containerID="1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.735128 4962 scope.go:117] "RemoveContainer" containerID="64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.743532 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.775729 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.798248 4962 scope.go:117] "RemoveContainer" containerID="1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53" Dec 01 21:57:29 crc kubenswrapper[4962]: E1201 21:57:29.799660 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53\": container with ID starting with 1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53 not found: ID does not exist" containerID="1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.799716 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53"} err="failed to get container status \"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53\": rpc error: code = NotFound desc = could not find container \"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53\": container with ID starting with 1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53 not found: ID does not exist" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.799768 4962 scope.go:117] "RemoveContainer" containerID="64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d" Dec 01 21:57:29 crc kubenswrapper[4962]: E1201 21:57:29.800391 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d\": container with ID starting with 64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d not found: ID does not exist" containerID="64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 
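
[annotation] The "RemoveContainer" followed by "ContainerStatus from runtime service failed ... NotFound" sequence above is a benign race: the kubelet asks the runtime for the status of a container it has just deleted, and the runtime answers with gRPC NotFound. Callers of such an API typically treat NotFound on delete or lookup as "already gone". A sketch of that idiom; removeContainer is a hypothetical stand-in for a CRI call:

    // notfound.go — treating gRPC NotFound as success when removing a container
    // that may already be gone, the pattern behind the errors above.
    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer is a stand-in for a CRI RemoveContainer call.
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    func removeIfPresent(id string) error {
        err := removeContainer(id)
        if status.Code(err) == codes.NotFound {
            return nil // already deleted by a concurrent cleanup; not an error
        }
        return err
    }

    func main() {
        fmt.Println(removeIfPresent("1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53"))
    }

[end annotation]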
21:57:29.800426 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d"} err="failed to get container status \"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d\": rpc error: code = NotFound desc = could not find container \"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d\": container with ID starting with 64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d not found: ID does not exist" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.800446 4962 scope.go:117] "RemoveContainer" containerID="1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.800666 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53"} err="failed to get container status \"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53\": rpc error: code = NotFound desc = could not find container \"1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53\": container with ID starting with 1af40045ee10da2e0e24c4225fc8825ac3aea3505cd6e6bfd672b8f73d06da53 not found: ID does not exist" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.800680 4962 scope.go:117] "RemoveContainer" containerID="64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.800914 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d"} err="failed to get container status \"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d\": rpc error: code = NotFound desc = could not find container \"64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d\": container with ID starting with 64c8141e1752b39485ef7d28df3d7f0ec485534c61a714d9cfb22b47b6f9491d not found: ID does not exist" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.827231 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:29 crc kubenswrapper[4962]: E1201 21:57:29.829364 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-metadata" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.829388 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-metadata" Dec 01 21:57:29 crc kubenswrapper[4962]: E1201 21:57:29.829428 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-log" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.829439 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-log" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.830871 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-log" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.830902 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" containerName="nova-metadata-metadata" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.832217 4962 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.835715 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.836007 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.855092 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.909820 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2bpn\" (UniqueName: \"kubernetes.io/projected/52a87903-f472-4bdd-a1bb-20869165d97f-kube-api-access-t2bpn\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.909957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.909984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52a87903-f472-4bdd-a1bb-20869165d97f-logs\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.910023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:29 crc kubenswrapper[4962]: I1201 21:57:29.910044 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-config-data\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.012134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2bpn\" (UniqueName: \"kubernetes.io/projected/52a87903-f472-4bdd-a1bb-20869165d97f-kube-api-access-t2bpn\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.012299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.012329 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52a87903-f472-4bdd-a1bb-20869165d97f-logs\") pod \"nova-metadata-0\" (UID: 
\"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.012376 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.012405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-config-data\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.012755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52a87903-f472-4bdd-a1bb-20869165d97f-logs\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.016293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.016304 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.018522 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-config-data\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.030966 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2bpn\" (UniqueName: \"kubernetes.io/projected/52a87903-f472-4bdd-a1bb-20869165d97f-kube-api-access-t2bpn\") pod \"nova-metadata-0\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.155424 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.244757 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f2b876-3ead-4d4c-b122-de64ae016e7f" path="/var/lib/kubelet/pods/e8f2b876-3ead-4d4c-b122-de64ae016e7f/volumes" Dec 01 21:57:30 crc kubenswrapper[4962]: W1201 21:57:30.685898 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a87903_f472_4bdd_a1bb_20869165d97f.slice/crio-747063fc879d49aec19e79be24cbf3542777f3ec41dc9e41bb6656ca84ea2f28 WatchSource:0}: Error finding container 747063fc879d49aec19e79be24cbf3542777f3ec41dc9e41bb6656ca84ea2f28: Status 404 returned error can't find the container with id 747063fc879d49aec19e79be24cbf3542777f3ec41dc9e41bb6656ca84ea2f28 Dec 01 21:57:30 crc kubenswrapper[4962]: I1201 21:57:30.691213 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:31 crc kubenswrapper[4962]: I1201 21:57:31.677953 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52a87903-f472-4bdd-a1bb-20869165d97f","Type":"ContainerStarted","Data":"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b"} Dec 01 21:57:31 crc kubenswrapper[4962]: I1201 21:57:31.678597 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52a87903-f472-4bdd-a1bb-20869165d97f","Type":"ContainerStarted","Data":"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d"} Dec 01 21:57:31 crc kubenswrapper[4962]: I1201 21:57:31.678707 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52a87903-f472-4bdd-a1bb-20869165d97f","Type":"ContainerStarted","Data":"747063fc879d49aec19e79be24cbf3542777f3ec41dc9e41bb6656ca84ea2f28"} Dec 01 21:57:31 crc kubenswrapper[4962]: I1201 21:57:31.698126 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.698101387 podStartE2EDuration="2.698101387s" podCreationTimestamp="2025-12-01 21:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:31.695562505 +0000 UTC m=+1435.797001700" watchObservedRunningTime="2025-12-01 21:57:31.698101387 +0000 UTC m=+1435.799540602" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.240251 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.240733 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.540035 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.540212 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.606417 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.643162 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.675841 4962 
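
[annotation] The "SyncLoop (probe)" and prober.go entries around this point are plain HTTP GET probes with a per-attempt timeout; a status in the 200-399 range counts as success, and "context deadline exceeded" means the server did not answer in time. A rough equivalent of such a probe follows; the URL is taken from the nova-api probe failures logged just below, and the 1-second timeout is an assumed value, not read from the pod spec.

    // probe.go — a rough equivalent of the kubelet HTTP probe behind the
    // "Probe failed" entries nearby; the 1s timeout is an assumption.
    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func probe(url string) error {
        client := &http.Client{Timeout: 1 * time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return nil // the kubelet treats 2xx/3xx as probe success
        }
        return fmt.Errorf("unhealthy: HTTP %d", resp.StatusCode)
    }

    func main() {
        fmt.Println(probe("http://10.217.0.236:8774/")) // nova-api endpoint from the log
    }

[end annotation]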
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.695690 4962 generic.go:334] "Generic (PLEG): container finished" podID="e236312e-ed98-484b-ad3e-9ed5a6645df0" containerID="1d0ff72f8769c0fd22c9037b58fc1ff886d1665425352ace8895034a81a97248" exitCode=0 Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.696907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvdgt" event={"ID":"e236312e-ed98-484b-ad3e-9ed5a6645df0","Type":"ContainerDied","Data":"1d0ff72f8769c0fd22c9037b58fc1ff886d1665425352ace8895034a81a97248"} Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.735998 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-ntsmc"] Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.736830 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" podUID="a52733e0-9924-46f4-aee8-705cda80cc38" containerName="dnsmasq-dns" containerID="cri-o://0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3" gracePeriod=10 Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.761766 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.784192 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:57:32 crc kubenswrapper[4962]: I1201 21:57:32.784246 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.308230 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.236:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.308508 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.236:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.477855 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.605819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-sb\") pod \"a52733e0-9924-46f4-aee8-705cda80cc38\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.605868 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-config\") pod \"a52733e0-9924-46f4-aee8-705cda80cc38\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.606099 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-svc\") pod \"a52733e0-9924-46f4-aee8-705cda80cc38\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.606214 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-swift-storage-0\") pod \"a52733e0-9924-46f4-aee8-705cda80cc38\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.606324 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-nb\") pod \"a52733e0-9924-46f4-aee8-705cda80cc38\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.606368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z785b\" (UniqueName: \"kubernetes.io/projected/a52733e0-9924-46f4-aee8-705cda80cc38-kube-api-access-z785b\") pod \"a52733e0-9924-46f4-aee8-705cda80cc38\" (UID: \"a52733e0-9924-46f4-aee8-705cda80cc38\") " Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.637164 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52733e0-9924-46f4-aee8-705cda80cc38-kube-api-access-z785b" (OuterVolumeSpecName: "kube-api-access-z785b") pod "a52733e0-9924-46f4-aee8-705cda80cc38" (UID: "a52733e0-9924-46f4-aee8-705cda80cc38"). InnerVolumeSpecName "kube-api-access-z785b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.666433 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a52733e0-9924-46f4-aee8-705cda80cc38" (UID: "a52733e0-9924-46f4-aee8-705cda80cc38"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.667482 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a52733e0-9924-46f4-aee8-705cda80cc38" (UID: "a52733e0-9924-46f4-aee8-705cda80cc38"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.677396 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-config" (OuterVolumeSpecName: "config") pod "a52733e0-9924-46f4-aee8-705cda80cc38" (UID: "a52733e0-9924-46f4-aee8-705cda80cc38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.681311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a52733e0-9924-46f4-aee8-705cda80cc38" (UID: "a52733e0-9924-46f4-aee8-705cda80cc38"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.709332 4962 generic.go:334] "Generic (PLEG): container finished" podID="f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" containerID="0fbb87ea3559411331543374877d9cc86ad777f447b9042fed0047fb49676970" exitCode=0 Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.709434 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-58lcl" event={"ID":"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d","Type":"ContainerDied","Data":"0fbb87ea3559411331543374877d9cc86ad777f447b9042fed0047fb49676970"} Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.709605 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.709874 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.709894 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z785b\" (UniqueName: \"kubernetes.io/projected/a52733e0-9924-46f4-aee8-705cda80cc38-kube-api-access-z785b\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.709905 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.709915 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.711859 4962 generic.go:334] "Generic (PLEG): container finished" podID="a52733e0-9924-46f4-aee8-705cda80cc38" containerID="0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3" exitCode=0 Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.712185 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" event={"ID":"a52733e0-9924-46f4-aee8-705cda80cc38","Type":"ContainerDied","Data":"0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3"} Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.712197 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.712216 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-ntsmc" event={"ID":"a52733e0-9924-46f4-aee8-705cda80cc38","Type":"ContainerDied","Data":"6ffe45da241b8ae5dd2d68951a590c7a2aa4b0035203a508523f9bdf5fa81136"} Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.712236 4962 scope.go:117] "RemoveContainer" containerID="0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.721067 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a52733e0-9924-46f4-aee8-705cda80cc38" (UID: "a52733e0-9924-46f4-aee8-705cda80cc38"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.799081 4962 scope.go:117] "RemoveContainer" containerID="e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.811994 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a52733e0-9924-46f4-aee8-705cda80cc38-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.843819 4962 scope.go:117] "RemoveContainer" containerID="0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3" Dec 01 21:57:33 crc kubenswrapper[4962]: E1201 21:57:33.844321 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3\": container with ID starting with 0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3 not found: ID does not exist" containerID="0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.844363 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3"} err="failed to get container status \"0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3\": rpc error: code = NotFound desc = could not find container \"0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3\": container with ID starting with 0998a210c85a339d1568cc92240d2d7ef19f0e2092225d0eeb8c265ed16406e3 not found: ID does not exist" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.844395 4962 scope.go:117] "RemoveContainer" containerID="e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819" Dec 01 21:57:33 crc kubenswrapper[4962]: E1201 21:57:33.844759 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819\": container with ID starting with e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819 not found: ID does not exist" containerID="e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819" Dec 01 21:57:33 crc kubenswrapper[4962]: I1201 21:57:33.844781 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819"} err="failed to get container status \"e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819\": rpc error: code = NotFound desc = could not find container \"e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819\": container with ID starting with e4a7685dbe844676fd9e7ceb6095a9db7288c2a0331059015cae3134ac6b1819 not found: ID does not exist" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.054515 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-ntsmc"] Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.068552 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-ntsmc"] Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.188720 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.245238 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52733e0-9924-46f4-aee8-705cda80cc38" path="/var/lib/kubelet/pods/a52733e0-9924-46f4-aee8-705cda80cc38/volumes" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.326848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-config-data\") pod \"e236312e-ed98-484b-ad3e-9ed5a6645df0\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.327230 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-scripts\") pod \"e236312e-ed98-484b-ad3e-9ed5a6645df0\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.327282 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x2r7\" (UniqueName: \"kubernetes.io/projected/e236312e-ed98-484b-ad3e-9ed5a6645df0-kube-api-access-8x2r7\") pod \"e236312e-ed98-484b-ad3e-9ed5a6645df0\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.327309 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-combined-ca-bundle\") pod \"e236312e-ed98-484b-ad3e-9ed5a6645df0\" (UID: \"e236312e-ed98-484b-ad3e-9ed5a6645df0\") " Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.332759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e236312e-ed98-484b-ad3e-9ed5a6645df0-kube-api-access-8x2r7" (OuterVolumeSpecName: "kube-api-access-8x2r7") pod "e236312e-ed98-484b-ad3e-9ed5a6645df0" (UID: "e236312e-ed98-484b-ad3e-9ed5a6645df0"). InnerVolumeSpecName "kube-api-access-8x2r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.334003 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-scripts" (OuterVolumeSpecName: "scripts") pod "e236312e-ed98-484b-ad3e-9ed5a6645df0" (UID: "e236312e-ed98-484b-ad3e-9ed5a6645df0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.418058 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-config-data" (OuterVolumeSpecName: "config-data") pod "e236312e-ed98-484b-ad3e-9ed5a6645df0" (UID: "e236312e-ed98-484b-ad3e-9ed5a6645df0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.431189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e236312e-ed98-484b-ad3e-9ed5a6645df0" (UID: "e236312e-ed98-484b-ad3e-9ed5a6645df0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.445140 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.445178 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x2r7\" (UniqueName: \"kubernetes.io/projected/e236312e-ed98-484b-ad3e-9ed5a6645df0-kube-api-access-8x2r7\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.445190 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.445198 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e236312e-ed98-484b-ad3e-9ed5a6645df0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.722476 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvdgt" event={"ID":"e236312e-ed98-484b-ad3e-9ed5a6645df0","Type":"ContainerDied","Data":"4d26bcdbc39958b489203334a4c288ee659b4a1764785ce9543a5876dd165a21"} Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.722823 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d26bcdbc39958b489203334a4c288ee659b4a1764785ce9543a5876dd165a21" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.722520 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvdgt" Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.851231 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.851454 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-log" containerID="cri-o://64de11a8be27f77a842e09d38601f7326e01819ceb15dad617e7e51d9e526011" gracePeriod=30 Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.851925 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-api" containerID="cri-o://7e3cf5e00421be7fc94877dd0e0c9871052b09c5b0041455aa5cf7869223e600" gracePeriod=30 Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.877962 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.914650 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.914868 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-log" containerID="cri-o://c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d" gracePeriod=30 Dec 01 21:57:34 crc kubenswrapper[4962]: I1201 21:57:34.915150 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-metadata" containerID="cri-o://becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b" gracePeriod=30 Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.157385 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.157430 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.268632 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.366352 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lsnd\" (UniqueName: \"kubernetes.io/projected/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-kube-api-access-4lsnd\") pod \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.366529 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-config-data\") pod \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.366690 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-scripts\") pod \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.366723 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-combined-ca-bundle\") pod \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\" (UID: \"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.384096 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-kube-api-access-4lsnd" (OuterVolumeSpecName: "kube-api-access-4lsnd") pod "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" (UID: "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d"). InnerVolumeSpecName "kube-api-access-4lsnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.387085 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-scripts" (OuterVolumeSpecName: "scripts") pod "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" (UID: "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.415193 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" (UID: "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.425781 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-config-data" (OuterVolumeSpecName: "config-data") pod "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" (UID: "f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.469388 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.469444 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.469459 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lsnd\" (UniqueName: \"kubernetes.io/projected/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-kube-api-access-4lsnd\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.469468 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.491401 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.570994 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-nova-metadata-tls-certs\") pod \"52a87903-f472-4bdd-a1bb-20869165d97f\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.571177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-combined-ca-bundle\") pod \"52a87903-f472-4bdd-a1bb-20869165d97f\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.571267 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52a87903-f472-4bdd-a1bb-20869165d97f-logs\") pod \"52a87903-f472-4bdd-a1bb-20869165d97f\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.571333 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2bpn\" (UniqueName: \"kubernetes.io/projected/52a87903-f472-4bdd-a1bb-20869165d97f-kube-api-access-t2bpn\") pod \"52a87903-f472-4bdd-a1bb-20869165d97f\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.571568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-config-data\") pod \"52a87903-f472-4bdd-a1bb-20869165d97f\" (UID: \"52a87903-f472-4bdd-a1bb-20869165d97f\") " Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.572205 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52a87903-f472-4bdd-a1bb-20869165d97f-logs" (OuterVolumeSpecName: "logs") pod "52a87903-f472-4bdd-a1bb-20869165d97f" (UID: "52a87903-f472-4bdd-a1bb-20869165d97f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.572869 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52a87903-f472-4bdd-a1bb-20869165d97f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.575771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a87903-f472-4bdd-a1bb-20869165d97f-kube-api-access-t2bpn" (OuterVolumeSpecName: "kube-api-access-t2bpn") pod "52a87903-f472-4bdd-a1bb-20869165d97f" (UID: "52a87903-f472-4bdd-a1bb-20869165d97f"). InnerVolumeSpecName "kube-api-access-t2bpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.606598 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-config-data" (OuterVolumeSpecName: "config-data") pod "52a87903-f472-4bdd-a1bb-20869165d97f" (UID: "52a87903-f472-4bdd-a1bb-20869165d97f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.626326 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52a87903-f472-4bdd-a1bb-20869165d97f" (UID: "52a87903-f472-4bdd-a1bb-20869165d97f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.629267 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "52a87903-f472-4bdd-a1bb-20869165d97f" (UID: "52a87903-f472-4bdd-a1bb-20869165d97f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.674742 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.674781 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2bpn\" (UniqueName: \"kubernetes.io/projected/52a87903-f472-4bdd-a1bb-20869165d97f-kube-api-access-t2bpn\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.674792 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.674803 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52a87903-f472-4bdd-a1bb-20869165d97f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.735339 4962 generic.go:334] "Generic (PLEG): container finished" podID="52a87903-f472-4bdd-a1bb-20869165d97f" containerID="becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b" exitCode=0 Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.735369 4962 generic.go:334] "Generic (PLEG): container finished" podID="52a87903-f472-4bdd-a1bb-20869165d97f" containerID="c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d" exitCode=143 Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.735390 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.735440 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52a87903-f472-4bdd-a1bb-20869165d97f","Type":"ContainerDied","Data":"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b"} Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.735512 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52a87903-f472-4bdd-a1bb-20869165d97f","Type":"ContainerDied","Data":"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d"} Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.735534 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52a87903-f472-4bdd-a1bb-20869165d97f","Type":"ContainerDied","Data":"747063fc879d49aec19e79be24cbf3542777f3ec41dc9e41bb6656ca84ea2f28"} Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.735564 4962 scope.go:117] "RemoveContainer" containerID="becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.738068 4962 generic.go:334] "Generic (PLEG): container finished" podID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerID="64de11a8be27f77a842e09d38601f7326e01819ceb15dad617e7e51d9e526011" exitCode=143 Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.738203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a57d6fa-2989-48c3-b66e-6362b284eda4","Type":"ContainerDied","Data":"64de11a8be27f77a842e09d38601f7326e01819ceb15dad617e7e51d9e526011"} Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.740977 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-58lcl" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.741213 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="99834bd0-d60e-45fd-b9f9-2d252fd9117c" containerName="nova-scheduler-scheduler" containerID="cri-o://18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" gracePeriod=30 Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.741364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-58lcl" event={"ID":"f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d","Type":"ContainerDied","Data":"c4835f0d0622d3b3cefab2c05641fd7984516534a427deed2f1b48c2106c3480"} Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.741410 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4835f0d0622d3b3cefab2c05641fd7984516534a427deed2f1b48c2106c3480" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.778905 4962 scope.go:117] "RemoveContainer" containerID="c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.809327 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.815504 4962 scope.go:117] "RemoveContainer" containerID="becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b" Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.816075 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b\": container with ID starting with becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b not found: ID does not exist" containerID="becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.816130 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b"} err="failed to get container status \"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b\": rpc error: code = NotFound desc = could not find container \"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b\": container with ID starting with becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b not found: ID does not exist" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.816155 4962 scope.go:117] "RemoveContainer" containerID="c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d" Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.816461 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d\": container with ID starting with c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d not found: ID does not exist" containerID="c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.816494 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d"} err="failed to get container status \"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d\": rpc error: code = NotFound desc = 
could not find container \"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d\": container with ID starting with c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d not found: ID does not exist" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.816512 4962 scope.go:117] "RemoveContainer" containerID="becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.816693 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b"} err="failed to get container status \"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b\": rpc error: code = NotFound desc = could not find container \"becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b\": container with ID starting with becd3a0bdb94d7d4b5a8fd466979d217b747192a82b72bd8f456acda1013a61b not found: ID does not exist" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.816716 4962 scope.go:117] "RemoveContainer" containerID="c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.816910 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d"} err="failed to get container status \"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d\": rpc error: code = NotFound desc = could not find container \"c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d\": container with ID starting with c11d1b4742ee283336e14db2b0bd3c52ef5800ab1ac56451890d4ac0240f694d not found: ID does not exist" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.822503 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.877524 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.879164 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" containerName="nova-cell1-conductor-db-sync" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.879283 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" containerName="nova-cell1-conductor-db-sync" Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.879377 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52733e0-9924-46f4-aee8-705cda80cc38" containerName="init" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.879464 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52733e0-9924-46f4-aee8-705cda80cc38" containerName="init" Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.879567 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-log" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.879635 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-log" Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.879726 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e236312e-ed98-484b-ad3e-9ed5a6645df0" containerName="nova-manage" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.879802 4962 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e236312e-ed98-484b-ad3e-9ed5a6645df0" containerName="nova-manage" Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.879878 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52733e0-9924-46f4-aee8-705cda80cc38" containerName="dnsmasq-dns" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.880026 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52733e0-9924-46f4-aee8-705cda80cc38" containerName="dnsmasq-dns" Dec 01 21:57:35 crc kubenswrapper[4962]: E1201 21:57:35.880116 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-metadata" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.880184 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-metadata" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.880524 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-metadata" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.880605 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" containerName="nova-cell1-conductor-db-sync" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.880702 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" containerName="nova-metadata-log" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.880811 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e236312e-ed98-484b-ad3e-9ed5a6645df0" containerName="nova-manage" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.880892 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52733e0-9924-46f4-aee8-705cda80cc38" containerName="dnsmasq-dns" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.882688 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.885784 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.886112 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.909664 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.927689 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.930743 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.933742 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.942888 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980589 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7027bfd7-ae97-419b-aebd-11e811b45486-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980632 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-config-data\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176fcc-efb7-4ace-90bd-5a0f95763c00-logs\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7027bfd7-ae97-419b-aebd-11e811b45486-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980821 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980859 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvvhs\" (UniqueName: \"kubernetes.io/projected/7027bfd7-ae97-419b-aebd-11e811b45486-kube-api-access-qvvhs\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980910 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:35 crc kubenswrapper[4962]: I1201 21:57:35.980950 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl7fx\" (UniqueName: \"kubernetes.io/projected/a4176fcc-efb7-4ace-90bd-5a0f95763c00-kube-api-access-fl7fx\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: 
I1201 21:57:36.082666 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7027bfd7-ae97-419b-aebd-11e811b45486-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.082720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.082781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvvhs\" (UniqueName: \"kubernetes.io/projected/7027bfd7-ae97-419b-aebd-11e811b45486-kube-api-access-qvvhs\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.082854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.082901 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl7fx\" (UniqueName: \"kubernetes.io/projected/a4176fcc-efb7-4ace-90bd-5a0f95763c00-kube-api-access-fl7fx\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.082970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7027bfd7-ae97-419b-aebd-11e811b45486-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.082999 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-config-data\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.083103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176fcc-efb7-4ace-90bd-5a0f95763c00-logs\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.083821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176fcc-efb7-4ace-90bd-5a0f95763c00-logs\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.087287 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-config-data\") pod \"nova-metadata-0\" (UID: 
\"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.090143 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.090715 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.091445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7027bfd7-ae97-419b-aebd-11e811b45486-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.094814 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7027bfd7-ae97-419b-aebd-11e811b45486-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.099548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvvhs\" (UniqueName: \"kubernetes.io/projected/7027bfd7-ae97-419b-aebd-11e811b45486-kube-api-access-qvvhs\") pod \"nova-cell1-conductor-0\" (UID: \"7027bfd7-ae97-419b-aebd-11e811b45486\") " pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.100277 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl7fx\" (UniqueName: \"kubernetes.io/projected/a4176fcc-efb7-4ace-90bd-5a0f95763c00-kube-api-access-fl7fx\") pod \"nova-metadata-0\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.232563 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.248785 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.249524 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a87903-f472-4bdd-a1bb-20869165d97f" path="/var/lib/kubelet/pods/52a87903-f472-4bdd-a1bb-20869165d97f/volumes" Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.862735 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:57:36 crc kubenswrapper[4962]: I1201 21:57:36.932124 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 21:57:37 crc kubenswrapper[4962]: E1201 21:57:37.542904 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 21:57:37 crc kubenswrapper[4962]: E1201 21:57:37.544352 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 21:57:37 crc kubenswrapper[4962]: E1201 21:57:37.546801 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 21:57:37 crc kubenswrapper[4962]: E1201 21:57:37.546833 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="99834bd0-d60e-45fd-b9f9-2d252fd9117c" containerName="nova-scheduler-scheduler" Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.776538 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7027bfd7-ae97-419b-aebd-11e811b45486","Type":"ContainerStarted","Data":"4962b50be12765f487128a1f799942792c5902d9c495f11f53fabfbd9a0f2d35"} Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.776592 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7027bfd7-ae97-419b-aebd-11e811b45486","Type":"ContainerStarted","Data":"c1ca636100c6ee61e86ca08afada9c03a094dbf6b59bfba6925dc241b6f44bb0"} Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.778129 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.781547 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176fcc-efb7-4ace-90bd-5a0f95763c00","Type":"ContainerStarted","Data":"0b8cf51fd167f1e54c9c805e29c1a7697b89ff963a7381ef0fe424bb972a6250"} Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.781586 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a4176fcc-efb7-4ace-90bd-5a0f95763c00","Type":"ContainerStarted","Data":"bc64d8a39eaab4dc24ac303ee3d12b50a27b7806023f4ec717d25cdc571c95c4"} Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.781599 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176fcc-efb7-4ace-90bd-5a0f95763c00","Type":"ContainerStarted","Data":"86c61da88cdacac8ee16816559ea7c139a64bbae7e6bf1f279b2bd84a76294f7"} Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.807055 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.807034758 podStartE2EDuration="2.807034758s" podCreationTimestamp="2025-12-01 21:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:37.791988742 +0000 UTC m=+1441.893427937" watchObservedRunningTime="2025-12-01 21:57:37.807034758 +0000 UTC m=+1441.908473953" Dec 01 21:57:37 crc kubenswrapper[4962]: I1201 21:57:37.822140 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.822125065 podStartE2EDuration="2.822125065s" podCreationTimestamp="2025-12-01 21:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:37.816489125 +0000 UTC m=+1441.917928330" watchObservedRunningTime="2025-12-01 21:57:37.822125065 +0000 UTC m=+1441.923564260" Dec 01 21:57:39 crc kubenswrapper[4962]: I1201 21:57:39.812040 4962 generic.go:334] "Generic (PLEG): container finished" podID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerID="7e3cf5e00421be7fc94877dd0e0c9871052b09c5b0041455aa5cf7869223e600" exitCode=0 Dec 01 21:57:39 crc kubenswrapper[4962]: I1201 21:57:39.812130 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a57d6fa-2989-48c3-b66e-6362b284eda4","Type":"ContainerDied","Data":"7e3cf5e00421be7fc94877dd0e0c9871052b09c5b0041455aa5cf7869223e600"} Dec 01 21:57:39 crc kubenswrapper[4962]: I1201 21:57:39.813075 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a57d6fa-2989-48c3-b66e-6362b284eda4","Type":"ContainerDied","Data":"7006d324db4297f99cca2d534575134bf1169582b98b90898c655b3cedec764d"} Dec 01 21:57:39 crc kubenswrapper[4962]: I1201 21:57:39.813094 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7006d324db4297f99cca2d534575134bf1169582b98b90898c655b3cedec764d" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.043756 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.087041 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-config-data\") pod \"5a57d6fa-2989-48c3-b66e-6362b284eda4\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.087253 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-combined-ca-bundle\") pod \"5a57d6fa-2989-48c3-b66e-6362b284eda4\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.087358 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j2lt\" (UniqueName: \"kubernetes.io/projected/5a57d6fa-2989-48c3-b66e-6362b284eda4-kube-api-access-9j2lt\") pod \"5a57d6fa-2989-48c3-b66e-6362b284eda4\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.087405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a57d6fa-2989-48c3-b66e-6362b284eda4-logs\") pod \"5a57d6fa-2989-48c3-b66e-6362b284eda4\" (UID: \"5a57d6fa-2989-48c3-b66e-6362b284eda4\") " Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.093857 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a57d6fa-2989-48c3-b66e-6362b284eda4-logs" (OuterVolumeSpecName: "logs") pod "5a57d6fa-2989-48c3-b66e-6362b284eda4" (UID: "5a57d6fa-2989-48c3-b66e-6362b284eda4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.095525 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a57d6fa-2989-48c3-b66e-6362b284eda4-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.122656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a57d6fa-2989-48c3-b66e-6362b284eda4-kube-api-access-9j2lt" (OuterVolumeSpecName: "kube-api-access-9j2lt") pod "5a57d6fa-2989-48c3-b66e-6362b284eda4" (UID: "5a57d6fa-2989-48c3-b66e-6362b284eda4"). InnerVolumeSpecName "kube-api-access-9j2lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.133382 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a57d6fa-2989-48c3-b66e-6362b284eda4" (UID: "5a57d6fa-2989-48c3-b66e-6362b284eda4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.136559 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-config-data" (OuterVolumeSpecName: "config-data") pod "5a57d6fa-2989-48c3-b66e-6362b284eda4" (UID: "5a57d6fa-2989-48c3-b66e-6362b284eda4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.197160 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.197194 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a57d6fa-2989-48c3-b66e-6362b284eda4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.197207 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j2lt\" (UniqueName: \"kubernetes.io/projected/5a57d6fa-2989-48c3-b66e-6362b284eda4-kube-api-access-9j2lt\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:40 crc kubenswrapper[4962]: E1201 21:57:40.279499 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99834bd0_d60e_45fd_b9f9_2d252fd9117c.slice/crio-conmon-18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99834bd0_d60e_45fd_b9f9_2d252fd9117c.slice/crio-18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a57d6fa_2989_48c3_b66e_6362b284eda4.slice/crio-conmon-7e3cf5e00421be7fc94877dd0e0c9871052b09c5b0041455aa5cf7869223e600.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a57d6fa_2989_48c3_b66e_6362b284eda4.slice/crio-7e3cf5e00421be7fc94877dd0e0c9871052b09c5b0041455aa5cf7869223e600.scope\": RecentStats: unable to find data in memory cache]" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.352503 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.399761 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-config-data\") pod \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.399815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-combined-ca-bundle\") pod \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.399878 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mkvf\" (UniqueName: \"kubernetes.io/projected/99834bd0-d60e-45fd-b9f9-2d252fd9117c-kube-api-access-7mkvf\") pod \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\" (UID: \"99834bd0-d60e-45fd-b9f9-2d252fd9117c\") " Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.420159 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99834bd0-d60e-45fd-b9f9-2d252fd9117c-kube-api-access-7mkvf" (OuterVolumeSpecName: "kube-api-access-7mkvf") pod "99834bd0-d60e-45fd-b9f9-2d252fd9117c" (UID: "99834bd0-d60e-45fd-b9f9-2d252fd9117c"). InnerVolumeSpecName "kube-api-access-7mkvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.443335 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-config-data" (OuterVolumeSpecName: "config-data") pod "99834bd0-d60e-45fd-b9f9-2d252fd9117c" (UID: "99834bd0-d60e-45fd-b9f9-2d252fd9117c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.446374 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99834bd0-d60e-45fd-b9f9-2d252fd9117c" (UID: "99834bd0-d60e-45fd-b9f9-2d252fd9117c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.513484 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.513524 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99834bd0-d60e-45fd-b9f9-2d252fd9117c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.513538 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mkvf\" (UniqueName: \"kubernetes.io/projected/99834bd0-d60e-45fd-b9f9-2d252fd9117c-kube-api-access-7mkvf\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.827711 4962 generic.go:334] "Generic (PLEG): container finished" podID="99834bd0-d60e-45fd-b9f9-2d252fd9117c" containerID="18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" exitCode=0 Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.827754 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.828086 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.827774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99834bd0-d60e-45fd-b9f9-2d252fd9117c","Type":"ContainerDied","Data":"18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c"} Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.828194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99834bd0-d60e-45fd-b9f9-2d252fd9117c","Type":"ContainerDied","Data":"53cd5e82a129426077ed2007096cb57f5a771472cf0d4a9f14271ff700d82934"} Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.828211 4962 scope.go:117] "RemoveContainer" containerID="18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.866970 4962 scope.go:117] "RemoveContainer" containerID="18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.885262 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:40 crc kubenswrapper[4962]: E1201 21:57:40.885446 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c\": container with ID starting with 18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c not found: ID does not exist" containerID="18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.885489 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c"} err="failed to get container status \"18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c\": rpc error: code = NotFound desc = could not find container \"18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c\": container with ID starting with 
18a24966f77ef48548f3d31bc1a4a6262edf4c5489635cd60eee5816b080366c not found: ID does not exist" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.918998 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.957103 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.976901 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.987956 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:40 crc kubenswrapper[4962]: E1201 21:57:40.988508 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99834bd0-d60e-45fd-b9f9-2d252fd9117c" containerName="nova-scheduler-scheduler" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.988521 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="99834bd0-d60e-45fd-b9f9-2d252fd9117c" containerName="nova-scheduler-scheduler" Dec 01 21:57:40 crc kubenswrapper[4962]: E1201 21:57:40.988537 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-api" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.988543 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-api" Dec 01 21:57:40 crc kubenswrapper[4962]: E1201 21:57:40.988576 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-log" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.988583 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-log" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.988803 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="99834bd0-d60e-45fd-b9f9-2d252fd9117c" containerName="nova-scheduler-scheduler" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.988819 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-api" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.988840 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" containerName="nova-api-log" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.990594 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.998179 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 21:57:40 crc kubenswrapper[4962]: I1201 21:57:40.998570 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.001033 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.004356 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.011748 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.030191 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.128091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.128214 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4kn\" (UniqueName: \"kubernetes.io/projected/66789194-99ac-4fdf-9fc0-350fa5422867-kube-api-access-ss4kn\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.128238 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.128269 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66789194-99ac-4fdf-9fc0-350fa5422867-logs\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.128293 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-config-data\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.128319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzr7x\" (UniqueName: \"kubernetes.io/projected/712ea540-d060-4b1c-a201-7b7593c942dc-kube-api-access-fzr7x\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.128346 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-config-data\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.231074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4kn\" (UniqueName: \"kubernetes.io/projected/66789194-99ac-4fdf-9fc0-350fa5422867-kube-api-access-ss4kn\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc 
kubenswrapper[4962]: I1201 21:57:41.231376 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.231421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66789194-99ac-4fdf-9fc0-350fa5422867-logs\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.231452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-config-data\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.231481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzr7x\" (UniqueName: \"kubernetes.io/projected/712ea540-d060-4b1c-a201-7b7593c942dc-kube-api-access-fzr7x\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.231505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-config-data\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.231616 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.232006 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66789194-99ac-4fdf-9fc0-350fa5422867-logs\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.233031 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.233468 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.236828 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.237358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-config-data\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.237921 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.244643 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-config-data\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.253043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzr7x\" (UniqueName: \"kubernetes.io/projected/712ea540-d060-4b1c-a201-7b7593c942dc-kube-api-access-fzr7x\") pod \"nova-scheduler-0\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") " pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.263234 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4kn\" (UniqueName: \"kubernetes.io/projected/66789194-99ac-4fdf-9fc0-350fa5422867-kube-api-access-ss4kn\") pod \"nova-api-0\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") " pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.319481 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.335372 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.915205 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:57:41 crc kubenswrapper[4962]: I1201 21:57:41.963873 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:57:41 crc kubenswrapper[4962]: W1201 21:57:41.968337 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod712ea540_d060_4b1c_a201_7b7593c942dc.slice/crio-b0a780728cfa505e8b4801b2697e7f49c8c7712c031d4ec20a6733fcc5bfc2fa WatchSource:0}: Error finding container b0a780728cfa505e8b4801b2697e7f49c8c7712c031d4ec20a6733fcc5bfc2fa: Status 404 returned error can't find the container with id b0a780728cfa505e8b4801b2697e7f49c8c7712c031d4ec20a6733fcc5bfc2fa Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.265768 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a57d6fa-2989-48c3-b66e-6362b284eda4" path="/var/lib/kubelet/pods/5a57d6fa-2989-48c3-b66e-6362b284eda4/volumes" Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.267628 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99834bd0-d60e-45fd-b9f9-2d252fd9117c" path="/var/lib/kubelet/pods/99834bd0-d60e-45fd-b9f9-2d252fd9117c/volumes" Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.854772 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66789194-99ac-4fdf-9fc0-350fa5422867","Type":"ContainerStarted","Data":"e46de99b8473033437861ecfef864ffc26a78968b854651184220f433944c7d5"} Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.855145 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"66789194-99ac-4fdf-9fc0-350fa5422867","Type":"ContainerStarted","Data":"3ade7d54e515b3af6ab7f6813de832c61375247f29b19e7e84248ba3942b0745"} Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.855156 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66789194-99ac-4fdf-9fc0-350fa5422867","Type":"ContainerStarted","Data":"73f080f31f6308fdfdf13ccc705ff080fd644f483c8a510e78e18f729962cacf"} Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.857061 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"712ea540-d060-4b1c-a201-7b7593c942dc","Type":"ContainerStarted","Data":"8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b"} Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.857087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"712ea540-d060-4b1c-a201-7b7593c942dc","Type":"ContainerStarted","Data":"b0a780728cfa505e8b4801b2697e7f49c8c7712c031d4ec20a6733fcc5bfc2fa"} Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.875205 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.875172685 podStartE2EDuration="2.875172685s" podCreationTimestamp="2025-12-01 21:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:42.871736037 +0000 UTC m=+1446.973175252" watchObservedRunningTime="2025-12-01 21:57:42.875172685 +0000 UTC m=+1446.976611930" Dec 01 21:57:42 crc kubenswrapper[4962]: I1201 21:57:42.907442 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.907424378 podStartE2EDuration="2.907424378s" podCreationTimestamp="2025-12-01 21:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:57:42.897420775 +0000 UTC m=+1446.998859970" watchObservedRunningTime="2025-12-01 21:57:42.907424378 +0000 UTC m=+1447.008863583" Dec 01 21:57:45 crc kubenswrapper[4962]: I1201 21:57:45.783507 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 21:57:46 crc kubenswrapper[4962]: I1201 21:57:46.239265 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 21:57:46 crc kubenswrapper[4962]: I1201 21:57:46.239326 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 21:57:46 crc kubenswrapper[4962]: I1201 21:57:46.309987 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 21:57:46 crc kubenswrapper[4962]: I1201 21:57:46.336373 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 21:57:47 crc kubenswrapper[4962]: I1201 21:57:47.253211 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.243:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 21:57:47 crc kubenswrapper[4962]: I1201 21:57:47.253521 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.243:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 21:57:48 crc kubenswrapper[4962]: I1201 21:57:48.942399 4962 generic.go:334] "Generic (PLEG): container finished" podID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerID="89272950bdbf82012cd12825bfcbf1da1de172ee4d61b5963fa60c515724d833" exitCode=137 Dec 01 21:57:48 crc kubenswrapper[4962]: I1201 21:57:48.942477 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerDied","Data":"89272950bdbf82012cd12825bfcbf1da1de172ee4d61b5963fa60c515724d833"} Dec 01 21:57:48 crc kubenswrapper[4962]: I1201 21:57:48.942834 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"07ba8682-f1b4-4f16-85c9-99b5dcc82666","Type":"ContainerDied","Data":"5923621d0476aa79a2fefabf810c539fa2bc9a7e19fadc4ded9359e1b0e19a22"} Dec 01 21:57:48 crc kubenswrapper[4962]: I1201 21:57:48.942848 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5923621d0476aa79a2fefabf810c539fa2bc9a7e19fadc4ded9359e1b0e19a22" Dec 01 21:57:48 crc kubenswrapper[4962]: I1201 21:57:48.950459 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.061515 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjnx2\" (UniqueName: \"kubernetes.io/projected/07ba8682-f1b4-4f16-85c9-99b5dcc82666-kube-api-access-cjnx2\") pod \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.062608 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-config-data\") pod \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.062838 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-combined-ca-bundle\") pod \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.063035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-scripts\") pod \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\" (UID: \"07ba8682-f1b4-4f16-85c9-99b5dcc82666\") " Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.071299 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ba8682-f1b4-4f16-85c9-99b5dcc82666-kube-api-access-cjnx2" (OuterVolumeSpecName: "kube-api-access-cjnx2") pod "07ba8682-f1b4-4f16-85c9-99b5dcc82666" (UID: "07ba8682-f1b4-4f16-85c9-99b5dcc82666"). InnerVolumeSpecName "kube-api-access-cjnx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.072576 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-scripts" (OuterVolumeSpecName: "scripts") pod "07ba8682-f1b4-4f16-85c9-99b5dcc82666" (UID: "07ba8682-f1b4-4f16-85c9-99b5dcc82666"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.166236 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.166377 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjnx2\" (UniqueName: \"kubernetes.io/projected/07ba8682-f1b4-4f16-85c9-99b5dcc82666-kube-api-access-cjnx2\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.193030 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07ba8682-f1b4-4f16-85c9-99b5dcc82666" (UID: "07ba8682-f1b4-4f16-85c9-99b5dcc82666"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.208651 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-config-data" (OuterVolumeSpecName: "config-data") pod "07ba8682-f1b4-4f16-85c9-99b5dcc82666" (UID: "07ba8682-f1b4-4f16-85c9-99b5dcc82666"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.269596 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.269909 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ba8682-f1b4-4f16-85c9-99b5dcc82666-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.826652 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.827297 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3dced14f-6bff-4820-b135-78ef69ba6b33" containerName="kube-state-metrics" containerID="cri-o://5f35cf4077e4ba9ef3b58db0be30ca171cd84839dc0460825c1ad95e246c604e" gracePeriod=30 Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.905678 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.906184 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="a120e58c-62c2-4242-a668-151b872a9cb4" containerName="mysqld-exporter" containerID="cri-o://565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534" gracePeriod=30 Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.962801 4962 generic.go:334] "Generic (PLEG): container finished" podID="3dced14f-6bff-4820-b135-78ef69ba6b33" containerID="5f35cf4077e4ba9ef3b58db0be30ca171cd84839dc0460825c1ad95e246c604e" exitCode=2 Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.962953 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 21:57:49 crc kubenswrapper[4962]: I1201 21:57:49.971178 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3dced14f-6bff-4820-b135-78ef69ba6b33","Type":"ContainerDied","Data":"5f35cf4077e4ba9ef3b58db0be30ca171cd84839dc0460825c1ad95e246c604e"} Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.121994 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.137789 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.156166 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:50 crc kubenswrapper[4962]: E1201 21:57:50.156726 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-evaluator" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.156745 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-evaluator" Dec 01 21:57:50 crc kubenswrapper[4962]: E1201 21:57:50.156761 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-listener" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.156769 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-listener" Dec 01 21:57:50 crc kubenswrapper[4962]: E1201 21:57:50.156811 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-notifier" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.156817 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-notifier" Dec 01 21:57:50 crc kubenswrapper[4962]: E1201 21:57:50.156838 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-api" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.156848 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-api" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.157075 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-evaluator" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.157093 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-notifier" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.157102 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-api" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.157113 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" containerName="aodh-listener" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.159132 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.163168 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-bf8t4" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.163255 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.163410 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.163703 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.163838 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.172590 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.232809 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ba8682-f1b4-4f16-85c9-99b5dcc82666" path="/var/lib/kubelet/pods/07ba8682-f1b4-4f16-85c9-99b5dcc82666/volumes" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.315401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-config-data\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.315618 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bt5p\" (UniqueName: \"kubernetes.io/projected/01b5da04-0e15-442b-87c1-941fac20aeaf-kube-api-access-4bt5p\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.315998 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.316086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-internal-tls-certs\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.316118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-public-tls-certs\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.316392 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-scripts\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.418858 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-config-data\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.419408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bt5p\" (UniqueName: \"kubernetes.io/projected/01b5da04-0e15-442b-87c1-941fac20aeaf-kube-api-access-4bt5p\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.419482 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.419637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-internal-tls-certs\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.419664 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-public-tls-certs\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.419720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-scripts\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.425847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-config-data\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.426318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-scripts\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.426554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-internal-tls-certs\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.427411 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-public-tls-certs\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.427789 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.436120 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bt5p\" (UniqueName: \"kubernetes.io/projected/01b5da04-0e15-442b-87c1-941fac20aeaf-kube-api-access-4bt5p\") pod \"aodh-0\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.509065 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.514394 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.530820 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.626020 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rhvx\" (UniqueName: \"kubernetes.io/projected/a120e58c-62c2-4242-a668-151b872a9cb4-kube-api-access-8rhvx\") pod \"a120e58c-62c2-4242-a668-151b872a9cb4\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.626072 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-config-data\") pod \"a120e58c-62c2-4242-a668-151b872a9cb4\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.626116 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8hwx\" (UniqueName: \"kubernetes.io/projected/3dced14f-6bff-4820-b135-78ef69ba6b33-kube-api-access-x8hwx\") pod \"3dced14f-6bff-4820-b135-78ef69ba6b33\" (UID: \"3dced14f-6bff-4820-b135-78ef69ba6b33\") " Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.626151 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-combined-ca-bundle\") pod \"a120e58c-62c2-4242-a668-151b872a9cb4\" (UID: \"a120e58c-62c2-4242-a668-151b872a9cb4\") " Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.632973 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dced14f-6bff-4820-b135-78ef69ba6b33-kube-api-access-x8hwx" (OuterVolumeSpecName: "kube-api-access-x8hwx") pod "3dced14f-6bff-4820-b135-78ef69ba6b33" (UID: "3dced14f-6bff-4820-b135-78ef69ba6b33"). InnerVolumeSpecName "kube-api-access-x8hwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.634200 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a120e58c-62c2-4242-a668-151b872a9cb4-kube-api-access-8rhvx" (OuterVolumeSpecName: "kube-api-access-8rhvx") pod "a120e58c-62c2-4242-a668-151b872a9cb4" (UID: "a120e58c-62c2-4242-a668-151b872a9cb4"). InnerVolumeSpecName "kube-api-access-8rhvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.681633 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a120e58c-62c2-4242-a668-151b872a9cb4" (UID: "a120e58c-62c2-4242-a668-151b872a9cb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.697318 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-config-data" (OuterVolumeSpecName: "config-data") pod "a120e58c-62c2-4242-a668-151b872a9cb4" (UID: "a120e58c-62c2-4242-a668-151b872a9cb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.729068 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rhvx\" (UniqueName: \"kubernetes.io/projected/a120e58c-62c2-4242-a668-151b872a9cb4-kube-api-access-8rhvx\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.729104 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.729121 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8hwx\" (UniqueName: \"kubernetes.io/projected/3dced14f-6bff-4820-b135-78ef69ba6b33-kube-api-access-x8hwx\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.729135 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a120e58c-62c2-4242-a668-151b872a9cb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.975149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3dced14f-6bff-4820-b135-78ef69ba6b33","Type":"ContainerDied","Data":"cd63ae4baebc3800f00b82ad190c515c0cfdb69ed4241445e9af6edf6a8576ac"} Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.975222 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.975547 4962 scope.go:117] "RemoveContainer" containerID="5f35cf4077e4ba9ef3b58db0be30ca171cd84839dc0460825c1ad95e246c604e" Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.977518 4962 generic.go:334] "Generic (PLEG): container finished" podID="a120e58c-62c2-4242-a668-151b872a9cb4" containerID="565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534" exitCode=2 Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.977551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a120e58c-62c2-4242-a668-151b872a9cb4","Type":"ContainerDied","Data":"565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534"} Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.977575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a120e58c-62c2-4242-a668-151b872a9cb4","Type":"ContainerDied","Data":"01314995120ca2a359e62111f59a5bc1ddcbb3896a2be7ff5302f1af741be76a"} Dec 01 21:57:50 crc kubenswrapper[4962]: I1201 21:57:50.977630 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.007586 4962 scope.go:117] "RemoveContainer" containerID="565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.038889 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.067063 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.070061 4962 scope.go:117] "RemoveContainer" containerID="565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534" Dec 01 21:57:51 crc kubenswrapper[4962]: E1201 21:57:51.070629 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534\": container with ID starting with 565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534 not found: ID does not exist" containerID="565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.070711 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534"} err="failed to get container status \"565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534\": rpc error: code = NotFound desc = could not find container \"565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534\": container with ID starting with 565453a6732b26778e1fd23b877ef4726753596be880161a75cb33142f68b534 not found: ID does not exist" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.078293 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.093175 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.105656 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.115563 4962 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: E1201 21:57:51.119422 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a120e58c-62c2-4242-a668-151b872a9cb4" containerName="mysqld-exporter" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.119469 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a120e58c-62c2-4242-a668-151b872a9cb4" containerName="mysqld-exporter" Dec 01 21:57:51 crc kubenswrapper[4962]: E1201 21:57:51.119489 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dced14f-6bff-4820-b135-78ef69ba6b33" containerName="kube-state-metrics" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.119497 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dced14f-6bff-4820-b135-78ef69ba6b33" containerName="kube-state-metrics" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.119750 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dced14f-6bff-4820-b135-78ef69ba6b33" containerName="kube-state-metrics" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.119787 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a120e58c-62c2-4242-a668-151b872a9cb4" containerName="mysqld-exporter" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.133078 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.135274 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.137324 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.137810 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.138898 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.139197 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.139408 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.139558 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.141577 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.141675 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.141724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vvx79\" (UniqueName: \"kubernetes.io/projected/7a4079d4-140a-438c-a252-c0669217e113-kube-api-access-vvx79\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.141950 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5wh\" (UniqueName: \"kubernetes.io/projected/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-api-access-df5wh\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.142069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.142123 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.142228 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.142273 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-config-data\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.152843 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.244831 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.244924 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.244994 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvx79\" (UniqueName: \"kubernetes.io/projected/7a4079d4-140a-438c-a252-c0669217e113-kube-api-access-vvx79\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " 
pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.245048 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5wh\" (UniqueName: \"kubernetes.io/projected/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-api-access-df5wh\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.245122 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.245171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.245883 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.245953 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-config-data\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.250554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.251217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.255859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.256215 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-config-data\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.256576 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.261956 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4079d4-140a-438c-a252-c0669217e113-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.265067 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvx79\" (UniqueName: \"kubernetes.io/projected/7a4079d4-140a-438c-a252-c0669217e113-kube-api-access-vvx79\") pod \"mysqld-exporter-0\" (UID: \"7a4079d4-140a-438c-a252-c0669217e113\") " pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.266596 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5wh\" (UniqueName: \"kubernetes.io/projected/412ecf69-be53-4cb2-9ea4-867884bbf8cf-kube-api-access-df5wh\") pod \"kube-state-metrics-0\" (UID: \"412ecf69-be53-4cb2-9ea4-867884bbf8cf\") " pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.320588 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.320830 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.336584 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.463583 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.478473 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.537200 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.870372 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.971335 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.997176 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.997464 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-central-agent" containerID="cri-o://592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e" gracePeriod=30 Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.998029 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="proxy-httpd" containerID="cri-o://026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f" gracePeriod=30 Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.998085 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="sg-core" containerID="cri-o://ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2" gracePeriod=30 Dec 01 21:57:51 crc kubenswrapper[4962]: I1201 21:57:51.998119 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-notification-agent" containerID="cri-o://fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e" gracePeriod=30 Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 21:57:52.019699 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerStarted","Data":"2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc"} Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 21:57:52.020096 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerStarted","Data":"3472de3d4f771beb86d9af47d5c192f06e17bc8bb6f53146bb061a327191943f"} Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 21:57:52.068576 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 21:57:52.086669 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 21:57:52.231145 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dced14f-6bff-4820-b135-78ef69ba6b33" path="/var/lib/kubelet/pods/3dced14f-6bff-4820-b135-78ef69ba6b33/volumes" Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 21:57:52.231676 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a120e58c-62c2-4242-a668-151b872a9cb4" path="/var/lib/kubelet/pods/a120e58c-62c2-4242-a668-151b872a9cb4/volumes" Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 
21:57:52.404596 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.245:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 21:57:52 crc kubenswrapper[4962]: I1201 21:57:52.405038 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.245:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.039480 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7a4079d4-140a-438c-a252-c0669217e113","Type":"ContainerStarted","Data":"cb4ea78d32181f5d75f04549a0c3a292a9e8fb2efe3346a51adff0e74a806252"} Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.051358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerStarted","Data":"d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff"} Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.057348 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"412ecf69-be53-4cb2-9ea4-867884bbf8cf","Type":"ContainerStarted","Data":"0ffc63c958358937bbe26fff79df98f4cc712f3b0b1bc5929c1efd0a5d2af9a1"} Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.057660 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"412ecf69-be53-4cb2-9ea4-867884bbf8cf","Type":"ContainerStarted","Data":"a412e50c69e73e6e3b14c49a0a68bae44b6365c61332ae5f2ef502f7ef1a2043"} Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.057923 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.067130 4962 generic.go:334] "Generic (PLEG): container finished" podID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerID="026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f" exitCode=0 Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.067163 4962 generic.go:334] "Generic (PLEG): container finished" podID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerID="ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2" exitCode=2 Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.067174 4962 generic.go:334] "Generic (PLEG): container finished" podID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerID="592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e" exitCode=0 Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.067214 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerDied","Data":"026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f"} Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.067275 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerDied","Data":"ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2"} Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.067291 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerDied","Data":"592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e"} Dec 01 21:57:53 crc kubenswrapper[4962]: I1201 21:57:53.090039 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.721938424 podStartE2EDuration="2.090011534s" podCreationTimestamp="2025-12-01 21:57:51 +0000 UTC" firstStartedPulling="2025-12-01 21:57:52.003506698 +0000 UTC m=+1456.104945883" lastFinishedPulling="2025-12-01 21:57:52.371579788 +0000 UTC m=+1456.473018993" observedRunningTime="2025-12-01 21:57:53.078491328 +0000 UTC m=+1457.179930543" watchObservedRunningTime="2025-12-01 21:57:53.090011534 +0000 UTC m=+1457.191450739" Dec 01 21:57:54 crc kubenswrapper[4962]: I1201 21:57:54.081596 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerStarted","Data":"6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941"} Dec 01 21:57:54 crc kubenswrapper[4962]: I1201 21:57:54.083625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7a4079d4-140a-438c-a252-c0669217e113","Type":"ContainerStarted","Data":"aef0f0235fc5fd495b7e3810afbd9fa88c17be742c4469a7ef5047b922bd2667"} Dec 01 21:57:54 crc kubenswrapper[4962]: I1201 21:57:54.110831 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.416079926 podStartE2EDuration="3.110809371s" podCreationTimestamp="2025-12-01 21:57:51 +0000 UTC" firstStartedPulling="2025-12-01 21:57:52.077511264 +0000 UTC m=+1456.178950449" lastFinishedPulling="2025-12-01 21:57:52.772240699 +0000 UTC m=+1456.873679894" observedRunningTime="2025-12-01 21:57:54.106283383 +0000 UTC m=+1458.207722578" watchObservedRunningTime="2025-12-01 21:57:54.110809371 +0000 UTC m=+1458.212248566" Dec 01 21:57:55 crc kubenswrapper[4962]: I1201 21:57:55.117330 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerStarted","Data":"72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225"} Dec 01 21:57:56 crc kubenswrapper[4962]: I1201 21:57:56.272440 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 21:57:56 crc kubenswrapper[4962]: I1201 21:57:56.274308 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 21:57:56 crc kubenswrapper[4962]: I1201 21:57:56.298332 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 21:57:56 crc kubenswrapper[4962]: I1201 21:57:56.303954 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.209775576 podStartE2EDuration="6.303914764s" podCreationTimestamp="2025-12-01 21:57:50 +0000 UTC" firstStartedPulling="2025-12-01 21:57:51.087960881 +0000 UTC m=+1455.189400076" lastFinishedPulling="2025-12-01 21:57:54.182100059 +0000 UTC m=+1458.283539264" observedRunningTime="2025-12-01 21:57:55.148088794 +0000 UTC m=+1459.249528039" watchObservedRunningTime="2025-12-01 21:57:56.303914764 +0000 UTC m=+1460.405353959" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.065164 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.105694 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-sg-core-conf-yaml\") pod \"f6a46329-485b-472a-9c6c-d3c79dabfb91\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.105805 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7w42\" (UniqueName: \"kubernetes.io/projected/f6a46329-485b-472a-9c6c-d3c79dabfb91-kube-api-access-l7w42\") pod \"f6a46329-485b-472a-9c6c-d3c79dabfb91\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.105958 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-combined-ca-bundle\") pod \"f6a46329-485b-472a-9c6c-d3c79dabfb91\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.106024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-config-data\") pod \"f6a46329-485b-472a-9c6c-d3c79dabfb91\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.106114 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-run-httpd\") pod \"f6a46329-485b-472a-9c6c-d3c79dabfb91\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.106275 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-scripts\") pod \"f6a46329-485b-472a-9c6c-d3c79dabfb91\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.106371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-log-httpd\") pod \"f6a46329-485b-472a-9c6c-d3c79dabfb91\" (UID: \"f6a46329-485b-472a-9c6c-d3c79dabfb91\") " Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.106592 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f6a46329-485b-472a-9c6c-d3c79dabfb91" (UID: "f6a46329-485b-472a-9c6c-d3c79dabfb91"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.106788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f6a46329-485b-472a-9c6c-d3c79dabfb91" (UID: "f6a46329-485b-472a-9c6c-d3c79dabfb91"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.107972 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.108000 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6a46329-485b-472a-9c6c-d3c79dabfb91-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.123612 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a46329-485b-472a-9c6c-d3c79dabfb91-kube-api-access-l7w42" (OuterVolumeSpecName: "kube-api-access-l7w42") pod "f6a46329-485b-472a-9c6c-d3c79dabfb91" (UID: "f6a46329-485b-472a-9c6c-d3c79dabfb91"). InnerVolumeSpecName "kube-api-access-l7w42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.140241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-scripts" (OuterVolumeSpecName: "scripts") pod "f6a46329-485b-472a-9c6c-d3c79dabfb91" (UID: "f6a46329-485b-472a-9c6c-d3c79dabfb91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.164416 4962 generic.go:334] "Generic (PLEG): container finished" podID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerID="fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e" exitCode=0 Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.164615 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerDied","Data":"fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e"} Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.164672 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6a46329-485b-472a-9c6c-d3c79dabfb91","Type":"ContainerDied","Data":"1c82357763d2755c096e5c1ce06e13685c776c00b018c1ad08e7b207164c2624"} Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.164691 4962 scope.go:117] "RemoveContainer" containerID="026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.164729 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.179175 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.212060 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.212307 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7w42\" (UniqueName: \"kubernetes.io/projected/f6a46329-485b-472a-9c6c-d3c79dabfb91-kube-api-access-l7w42\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.253699 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f6a46329-485b-472a-9c6c-d3c79dabfb91" (UID: "f6a46329-485b-472a-9c6c-d3c79dabfb91"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.270376 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-config-data" (OuterVolumeSpecName: "config-data") pod "f6a46329-485b-472a-9c6c-d3c79dabfb91" (UID: "f6a46329-485b-472a-9c6c-d3c79dabfb91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.292443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6a46329-485b-472a-9c6c-d3c79dabfb91" (UID: "f6a46329-485b-472a-9c6c-d3c79dabfb91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.304035 4962 scope.go:117] "RemoveContainer" containerID="ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.315008 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.315214 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.315341 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a46329-485b-472a-9c6c-d3c79dabfb91-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.333146 4962 scope.go:117] "RemoveContainer" containerID="fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.358118 4962 scope.go:117] "RemoveContainer" containerID="592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.385001 4962 scope.go:117] "RemoveContainer" containerID="026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f" Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.386682 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f\": container with ID starting with 026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f not found: ID does not exist" containerID="026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.386716 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f"} err="failed to get container status \"026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f\": rpc error: code = NotFound desc = could not find container \"026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f\": container with ID starting with 026f0734ebf1646f899d40438474da1083cd94dd882d5149db146d4656a8a10f not found: ID does not exist" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.386740 4962 scope.go:117] "RemoveContainer" containerID="ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2" Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.387271 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2\": container with ID starting with ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2 not found: ID does not exist" containerID="ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.387347 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2"} err="failed to get container status 
\"ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2\": rpc error: code = NotFound desc = could not find container \"ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2\": container with ID starting with ee819f6636f293b188e89b69e598c9cf793546503006af4260549afd13af18a2 not found: ID does not exist" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.387409 4962 scope.go:117] "RemoveContainer" containerID="fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e" Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.387747 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e\": container with ID starting with fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e not found: ID does not exist" containerID="fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.387772 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e"} err="failed to get container status \"fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e\": rpc error: code = NotFound desc = could not find container \"fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e\": container with ID starting with fd96d78c4b92b5a1caf947c80fbfb8e27a019773df074e8e788498f8a570112e not found: ID does not exist" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.387791 4962 scope.go:117] "RemoveContainer" containerID="592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e" Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.388234 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e\": container with ID starting with 592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e not found: ID does not exist" containerID="592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.388256 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e"} err="failed to get container status \"592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e\": rpc error: code = NotFound desc = could not find container \"592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e\": container with ID starting with 592a94861620cb4df5a2be6b2da7248fb0f70e623d87a0527b24a3096da07c6e not found: ID does not exist" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.517461 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.532108 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.541784 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.542583 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-notification-agent" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.542611 4962 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-notification-agent" Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.542635 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="sg-core" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.542644 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="sg-core" Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.542690 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="proxy-httpd" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.542715 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="proxy-httpd" Dec 01 21:57:57 crc kubenswrapper[4962]: E1201 21:57:57.542729 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-central-agent" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.542738 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-central-agent" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.543114 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="sg-core" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.543138 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-notification-agent" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.543162 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="proxy-httpd" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.543176 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" containerName="ceilometer-central-agent" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.552597 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.559905 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.560257 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.560527 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.609129 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.621644 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.622981 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.623147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-log-httpd\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.623269 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hwv4\" (UniqueName: \"kubernetes.io/projected/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-kube-api-access-4hwv4\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.623350 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-config-data\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.623496 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-scripts\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.623594 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.623731 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-run-httpd\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.725793 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-log-httpd\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.726335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hwv4\" (UniqueName: \"kubernetes.io/projected/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-kube-api-access-4hwv4\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.726701 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-config-data\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.727461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-scripts\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.727585 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.726298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-log-httpd\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.727881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-run-httpd\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.728182 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-run-httpd\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.728251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.728492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.732140 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-scripts\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.732321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.733049 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-config-data\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.739451 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.741405 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.744429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hwv4\" (UniqueName: \"kubernetes.io/projected/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-kube-api-access-4hwv4\") pod \"ceilometer-0\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " pod="openstack/ceilometer-0" Dec 01 21:57:57 crc kubenswrapper[4962]: I1201 21:57:57.886189 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:57:58 crc kubenswrapper[4962]: I1201 21:57:58.269969 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a46329-485b-472a-9c6c-d3c79dabfb91" path="/var/lib/kubelet/pods/f6a46329-485b-472a-9c6c-d3c79dabfb91/volumes" Dec 01 21:57:58 crc kubenswrapper[4962]: W1201 21:57:58.409199 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54bc2da4_1b23_49a0_b9b9_a5f042b1e0e3.slice/crio-37598a79123cebeba7e8d616a1c3cda3be16cc3afd1a4db0670dbc9b2059d903 WatchSource:0}: Error finding container 37598a79123cebeba7e8d616a1c3cda3be16cc3afd1a4db0670dbc9b2059d903: Status 404 returned error can't find the container with id 37598a79123cebeba7e8d616a1c3cda3be16cc3afd1a4db0670dbc9b2059d903 Dec 01 21:57:58 crc kubenswrapper[4962]: I1201 21:57:58.416886 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.184785 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.199711 4962 generic.go:334] "Generic (PLEG): container finished" podID="65d408e2-365e-4ab9-9077-ea1706b8d4a2" containerID="0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf" exitCode=137 Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.199796 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"65d408e2-365e-4ab9-9077-ea1706b8d4a2","Type":"ContainerDied","Data":"0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf"} Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.199830 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"65d408e2-365e-4ab9-9077-ea1706b8d4a2","Type":"ContainerDied","Data":"abf6059f16ab3232d9a6140b5bdedcd0233a9f42c890764f61a0ef1219cf3973"} Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.199854 4962 scope.go:117] "RemoveContainer" containerID="0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.200100 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.203257 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerStarted","Data":"37598a79123cebeba7e8d616a1c3cda3be16cc3afd1a4db0670dbc9b2059d903"} Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.237129 4962 scope.go:117] "RemoveContainer" containerID="0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf" Dec 01 21:57:59 crc kubenswrapper[4962]: E1201 21:57:59.237851 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf\": container with ID starting with 0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf not found: ID does not exist" containerID="0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.237910 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf"} err="failed to get container status \"0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf\": rpc error: code = NotFound desc = could not find container \"0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf\": container with ID starting with 0d036f8c88f33e984a854a699f7cb61ebf803cd0ced1a3b5e7b3e8b3ebdec2cf not found: ID does not exist" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.274772 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-config-data\") pod \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.274974 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frw2c\" (UniqueName: \"kubernetes.io/projected/65d408e2-365e-4ab9-9077-ea1706b8d4a2-kube-api-access-frw2c\") pod \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " Dec 
01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.275066 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-combined-ca-bundle\") pod \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\" (UID: \"65d408e2-365e-4ab9-9077-ea1706b8d4a2\") " Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.282862 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d408e2-365e-4ab9-9077-ea1706b8d4a2-kube-api-access-frw2c" (OuterVolumeSpecName: "kube-api-access-frw2c") pod "65d408e2-365e-4ab9-9077-ea1706b8d4a2" (UID: "65d408e2-365e-4ab9-9077-ea1706b8d4a2"). InnerVolumeSpecName "kube-api-access-frw2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.311448 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65d408e2-365e-4ab9-9077-ea1706b8d4a2" (UID: "65d408e2-365e-4ab9-9077-ea1706b8d4a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.324794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-config-data" (OuterVolumeSpecName: "config-data") pod "65d408e2-365e-4ab9-9077-ea1706b8d4a2" (UID: "65d408e2-365e-4ab9-9077-ea1706b8d4a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.378360 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.378396 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frw2c\" (UniqueName: \"kubernetes.io/projected/65d408e2-365e-4ab9-9077-ea1706b8d4a2-kube-api-access-frw2c\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.378408 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d408e2-365e-4ab9-9077-ea1706b8d4a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.547432 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.566098 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.576054 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:59 crc kubenswrapper[4962]: E1201 21:57:59.576606 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d408e2-365e-4ab9-9077-ea1706b8d4a2" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.576617 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d408e2-365e-4ab9-9077-ea1706b8d4a2" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.576885 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="65d408e2-365e-4ab9-9077-ea1706b8d4a2" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.577751 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.580595 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.586174 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.587157 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.587264 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.683768 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.683862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wfm\" (UniqueName: \"kubernetes.io/projected/bf253872-abad-4b40-b941-2cbada4988ac-kube-api-access-m2wfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.683951 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.683981 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.684037 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.786822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.786917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m2wfm\" (UniqueName: \"kubernetes.io/projected/bf253872-abad-4b40-b941-2cbada4988ac-kube-api-access-m2wfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.786991 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.787017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.787050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.794874 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.795159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.796001 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.807181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf253872-abad-4b40-b941-2cbada4988ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.811432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wfm\" (UniqueName: \"kubernetes.io/projected/bf253872-abad-4b40-b941-2cbada4988ac-kube-api-access-m2wfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf253872-abad-4b40-b941-2cbada4988ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:57:59 crc kubenswrapper[4962]: I1201 21:57:59.902168 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 21:58:00 crc kubenswrapper[4962]: I1201 21:58:00.216118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerStarted","Data":"73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845"} Dec 01 21:58:00 crc kubenswrapper[4962]: I1201 21:58:00.216422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerStarted","Data":"6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e"} Dec 01 21:58:00 crc kubenswrapper[4962]: I1201 21:58:00.231636 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d408e2-365e-4ab9-9077-ea1706b8d4a2" path="/var/lib/kubelet/pods/65d408e2-365e-4ab9-9077-ea1706b8d4a2/volumes" Dec 01 21:58:00 crc kubenswrapper[4962]: I1201 21:58:00.368733 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.244963 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerStarted","Data":"8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13"} Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.246534 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bf253872-abad-4b40-b941-2cbada4988ac","Type":"ContainerStarted","Data":"b487c91339f158baec48cf1d4d77a8016957596639d23a150e2198e73be0c3a6"} Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.246562 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bf253872-abad-4b40-b941-2cbada4988ac","Type":"ContainerStarted","Data":"5676dd4d2159d4546ea0cf8453ecb5ced1d74a650773867d6ef5c516ae2824d2"} Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.284647 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.284629466 podStartE2EDuration="2.284629466s" podCreationTimestamp="2025-12-01 21:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:58:01.266503152 +0000 UTC m=+1465.367942377" watchObservedRunningTime="2025-12-01 21:58:01.284629466 +0000 UTC m=+1465.386068661" Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.323469 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.323939 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.328343 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.329745 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 21:58:01 crc kubenswrapper[4962]: I1201 21:58:01.484189 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.262772 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 
21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.269035 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.527959 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-bg7lh"]
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.531849 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.547693 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-bg7lh"]
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.659271 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.659376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.659403 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.659461 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.659482 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbt9x\" (UniqueName: \"kubernetes.io/projected/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-kube-api-access-kbt9x\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.659528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-config\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.761258 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-config\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.761403 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.761515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.761562 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.761693 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.761722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbt9x\" (UniqueName: \"kubernetes.io/projected/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-kube-api-access-kbt9x\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.762410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.762498 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.763149 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.763753 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.764744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-config\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.780914 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbt9x\" (UniqueName: \"kubernetes.io/projected/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-kube-api-access-kbt9x\") pod \"dnsmasq-dns-f84f9ccf-bg7lh\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.784971 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.785007 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.785042 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.785611 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be316a715ee51336fb8f9d77180528af883eb796f40ec81884b6acc27922aa28"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.785658 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://be316a715ee51336fb8f9d77180528af883eb796f40ec81884b6acc27922aa28" gracePeriod=600
Dec 01 21:58:02 crc kubenswrapper[4962]: I1201 21:58:02.873729 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh"
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.306351 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerStarted","Data":"01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259"}
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.307159 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.363388 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="be316a715ee51336fb8f9d77180528af883eb796f40ec81884b6acc27922aa28" exitCode=0
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.364540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"be316a715ee51336fb8f9d77180528af883eb796f40ec81884b6acc27922aa28"}
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.364645 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f"}
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.364714 4962 scope.go:117] "RemoveContainer" containerID="7e98140d5fb11879a3903d3761dc38b8ef264c041494b571b46af54f4f57bb50"
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.384423 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.510032791 podStartE2EDuration="6.384404206s" podCreationTimestamp="2025-12-01 21:57:57 +0000 UTC" firstStartedPulling="2025-12-01 21:57:58.412625366 +0000 UTC m=+1462.514064571" lastFinishedPulling="2025-12-01 21:58:02.286996801 +0000 UTC m=+1466.388435986" observedRunningTime="2025-12-01 21:58:03.35664605 +0000 UTC m=+1467.458085245" watchObservedRunningTime="2025-12-01 21:58:03.384404206 +0000 UTC m=+1467.485843391"
Dec 01 21:58:03 crc kubenswrapper[4962]: I1201 21:58:03.591965 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-bg7lh"]
Dec 01 21:58:04 crc kubenswrapper[4962]: I1201 21:58:04.383656 4962 generic.go:334] "Generic (PLEG): container finished" podID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerID="d1c5ba058c84dbadc3b7e623c9a161eaa74208431d249e3b27acaecbaa770789" exitCode=0
Dec 01 21:58:04 crc kubenswrapper[4962]: I1201 21:58:04.384016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" event={"ID":"a0db8b97-4fe1-4ccc-bdd6-e4635285a854","Type":"ContainerDied","Data":"d1c5ba058c84dbadc3b7e623c9a161eaa74208431d249e3b27acaecbaa770789"}
Dec 01 21:58:04 crc kubenswrapper[4962]: I1201 21:58:04.384194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" event={"ID":"a0db8b97-4fe1-4ccc-bdd6-e4635285a854","Type":"ContainerStarted","Data":"d5884c14e435507d16f47befcfd85a870b49a5761eca34309f7184917994d3b2"}
Dec 01 21:58:04 crc kubenswrapper[4962]: I1201 21:58:04.903020 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.192698 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.412381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" event={"ID":"a0db8b97-4fe1-4ccc-bdd6-e4635285a854","Type":"ContainerStarted","Data":"c77afe8853921b13088819e7c2221c0ebb8655e3725f3bb62bbc7964f0f8c9f1"} Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.412540 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="ceilometer-central-agent" containerID="cri-o://6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e" gracePeriod=30 Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.414045 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="proxy-httpd" containerID="cri-o://01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259" gracePeriod=30 Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.414104 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="sg-core" containerID="cri-o://8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13" gracePeriod=30 Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.414142 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="ceilometer-notification-agent" containerID="cri-o://73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845" gracePeriod=30 Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.452292 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" podStartSLOduration=3.452271602 podStartE2EDuration="3.452271602s" podCreationTimestamp="2025-12-01 21:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:58:05.437179505 +0000 UTC m=+1469.538618720" watchObservedRunningTime="2025-12-01 21:58:05.452271602 +0000 UTC m=+1469.553710797" Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.767890 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.768453 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-log" containerID="cri-o://3ade7d54e515b3af6ab7f6813de832c61375247f29b19e7e84248ba3942b0745" gracePeriod=30 Dec 01 21:58:05 crc kubenswrapper[4962]: I1201 21:58:05.768599 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-api" containerID="cri-o://e46de99b8473033437861ecfef864ffc26a78968b854651184220f433944c7d5" gracePeriod=30 Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.429147 4962 generic.go:334] "Generic (PLEG): container finished" podID="66789194-99ac-4fdf-9fc0-350fa5422867" containerID="3ade7d54e515b3af6ab7f6813de832c61375247f29b19e7e84248ba3942b0745" exitCode=143 Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.429238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"66789194-99ac-4fdf-9fc0-350fa5422867","Type":"ContainerDied","Data":"3ade7d54e515b3af6ab7f6813de832c61375247f29b19e7e84248ba3942b0745"} Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.432999 4962 generic.go:334] "Generic (PLEG): container finished" podID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerID="01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259" exitCode=0 Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.433038 4962 generic.go:334] "Generic (PLEG): container finished" podID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerID="8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13" exitCode=2 Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.433050 4962 generic.go:334] "Generic (PLEG): container finished" podID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerID="73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845" exitCode=0 Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.433068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerDied","Data":"01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259"} Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.433106 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerDied","Data":"8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13"} Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.433122 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerDied","Data":"73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845"} Dec 01 21:58:06 crc kubenswrapper[4962]: I1201 21:58:06.433382 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.367295 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.480180 4962 generic.go:334] "Generic (PLEG): container finished" podID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerID="6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e" exitCode=0 Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.480569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerDied","Data":"6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e"} Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.480642 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3","Type":"ContainerDied","Data":"37598a79123cebeba7e8d616a1c3cda3be16cc3afd1a4db0670dbc9b2059d903"} Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.480651 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.480662 4962 scope.go:117] "RemoveContainer" containerID="01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.502847 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-sg-core-conf-yaml\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.503124 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hwv4\" (UniqueName: \"kubernetes.io/projected/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-kube-api-access-4hwv4\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.503234 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-ceilometer-tls-certs\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.503314 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-config-data\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.503392 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-log-httpd\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.503425 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-run-httpd\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.503461 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-scripts\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.503511 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-combined-ca-bundle\") pod \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\" (UID: \"54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3\") " Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.508462 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.508569 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.515464 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-kube-api-access-4hwv4" (OuterVolumeSpecName: "kube-api-access-4hwv4") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "kube-api-access-4hwv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.517670 4962 scope.go:117] "RemoveContainer" containerID="8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.522676 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.522710 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.522724 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hwv4\" (UniqueName: \"kubernetes.io/projected/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-kube-api-access-4hwv4\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.526736 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-scripts" (OuterVolumeSpecName: "scripts") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.587360 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.601096 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.625228 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.625256 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.625267 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.644135 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.671093 4962 scope.go:117] "RemoveContainer" containerID="73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.672249 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-config-data" (OuterVolumeSpecName: "config-data") pod "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" (UID: "54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.698307 4962 scope.go:117] "RemoveContainer" containerID="6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.721670 4962 scope.go:117] "RemoveContainer" containerID="01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259" Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.722254 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259\": container with ID starting with 01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259 not found: ID does not exist" containerID="01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.722284 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259"} err="failed to get container status \"01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259\": rpc error: code = NotFound desc = could not find container \"01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259\": container with ID starting with 01ad774812216c83e989efbfb8763f577d888b70cdc96b7c38314d8442f52259 not found: ID does not exist" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.722306 4962 scope.go:117] "RemoveContainer" containerID="8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13" Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.722681 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13\": container with ID starting with 8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13 not found: ID does not exist" containerID="8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.722702 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13"} err="failed to get container status \"8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13\": rpc error: code = NotFound desc = could not find container \"8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13\": container with ID starting with 8be64bb5c3f819245f82c19ad302a759a0f95ece9ce2ccc63a35fb76e642bd13 not found: ID does not exist" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.722715 4962 scope.go:117] "RemoveContainer" containerID="73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845" Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.723126 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845\": container with ID starting with 73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845 not found: ID does not exist" containerID="73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.723148 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845"} err="failed to get container status \"73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845\": rpc error: code = NotFound desc = could not find container \"73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845\": container with ID starting with 73164a00da9ea05c9f610dc589a3e1f6469427d5ab721e638aeefe453e598845 not found: ID does not exist" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.723160 4962 scope.go:117] "RemoveContainer" containerID="6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e" Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.723439 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e\": container with ID starting with 6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e not found: ID does not exist" containerID="6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.723458 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e"} err="failed to get container status \"6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e\": rpc error: code = NotFound desc = could not find container \"6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e\": container with ID starting with 6cbe575a548b08579e6ad18d7714c9a5699b0a6e261582719c8740a3a3a3c19e not found: ID does not exist" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.727275 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.727299 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.840385 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.866895 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.896296 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.896853 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="ceilometer-notification-agent" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.896873 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="ceilometer-notification-agent" Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.896901 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="sg-core" Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.896911 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="sg-core" Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.896956 4962 
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.896964 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="ceilometer-central-agent"
Dec 01 21:58:07 crc kubenswrapper[4962]: E1201 21:58:07.896984 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="proxy-httpd"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.896990 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="proxy-httpd"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.897190 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="proxy-httpd"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.897204 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="ceilometer-notification-agent"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.897221 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="sg-core"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.897251 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" containerName="ceilometer-central-agent"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.899371 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.902650 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.902760 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.902817 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 01 21:58:07 crc kubenswrapper[4962]: I1201 21:58:07.914726 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.034851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvkmq\" (UniqueName: \"kubernetes.io/projected/bd27d7a4-d5ee-48ee-b317-ac637797097e-kube-api-access-cvkmq\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.034908 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.034929 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-run-httpd\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.035522 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-config-data\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.035585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-log-httpd\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.035627 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.035700 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-scripts\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.035716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137609 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-config-data\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137667 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-log-httpd\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137736 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-scripts\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137750 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137783 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkmq\" (UniqueName: \"kubernetes.io/projected/bd27d7a4-d5ee-48ee-b317-ac637797097e-kube-api-access-cvkmq\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.137825 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-run-httpd\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.138354 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-run-httpd\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.139683 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-log-httpd\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.144456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-config-data\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.145102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.149714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-scripts\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.153177 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.154644 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.165182 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkmq\" (UniqueName: \"kubernetes.io/projected/bd27d7a4-d5ee-48ee-b317-ac637797097e-kube-api-access-cvkmq\") pod \"ceilometer-0\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") " pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.229725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.235693 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3" path="/var/lib/kubelet/pods/54bc2da4-1b23-49a0-b9b9-a5f042b1e0e3/volumes"
Dec 01 21:58:08 crc kubenswrapper[4962]: I1201 21:58:08.817412 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 21:58:08 crc kubenswrapper[4962]: W1201 21:58:08.822646 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd27d7a4_d5ee_48ee_b317_ac637797097e.slice/crio-5ce9f6740dd2b413a3f05a7e48a594cdd8e9f9ae1631938f348d14e2d0402332 WatchSource:0}: Error finding container 5ce9f6740dd2b413a3f05a7e48a594cdd8e9f9ae1631938f348d14e2d0402332: Status 404 returned error can't find the container with id 5ce9f6740dd2b413a3f05a7e48a594cdd8e9f9ae1631938f348d14e2d0402332
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.531230 4962 generic.go:334] "Generic (PLEG): container finished" podID="66789194-99ac-4fdf-9fc0-350fa5422867" containerID="e46de99b8473033437861ecfef864ffc26a78968b854651184220f433944c7d5" exitCode=0
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.531420 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66789194-99ac-4fdf-9fc0-350fa5422867","Type":"ContainerDied","Data":"e46de99b8473033437861ecfef864ffc26a78968b854651184220f433944c7d5"}
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.531590 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66789194-99ac-4fdf-9fc0-350fa5422867","Type":"ContainerDied","Data":"73f080f31f6308fdfdf13ccc705ff080fd644f483c8a510e78e18f729962cacf"}
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.531673 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73f080f31f6308fdfdf13ccc705ff080fd644f483c8a510e78e18f729962cacf"
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.533399 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerStarted","Data":"5ce9f6740dd2b413a3f05a7e48a594cdd8e9f9ae1631938f348d14e2d0402332"}
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.552528 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.687962 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-combined-ca-bundle\") pod \"66789194-99ac-4fdf-9fc0-350fa5422867\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") "
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.688345 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-config-data\") pod \"66789194-99ac-4fdf-9fc0-350fa5422867\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") "
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.688722 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66789194-99ac-4fdf-9fc0-350fa5422867-logs\") pod \"66789194-99ac-4fdf-9fc0-350fa5422867\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") "
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.688799 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4kn\" (UniqueName: \"kubernetes.io/projected/66789194-99ac-4fdf-9fc0-350fa5422867-kube-api-access-ss4kn\") pod \"66789194-99ac-4fdf-9fc0-350fa5422867\" (UID: \"66789194-99ac-4fdf-9fc0-350fa5422867\") "
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.689619 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66789194-99ac-4fdf-9fc0-350fa5422867-logs" (OuterVolumeSpecName: "logs") pod "66789194-99ac-4fdf-9fc0-350fa5422867" (UID: "66789194-99ac-4fdf-9fc0-350fa5422867"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.690244 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66789194-99ac-4fdf-9fc0-350fa5422867-logs\") on node \"crc\" DevicePath \"\""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.695136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66789194-99ac-4fdf-9fc0-350fa5422867-kube-api-access-ss4kn" (OuterVolumeSpecName: "kube-api-access-ss4kn") pod "66789194-99ac-4fdf-9fc0-350fa5422867" (UID: "66789194-99ac-4fdf-9fc0-350fa5422867"). InnerVolumeSpecName "kube-api-access-ss4kn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.743459 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66789194-99ac-4fdf-9fc0-350fa5422867" (UID: "66789194-99ac-4fdf-9fc0-350fa5422867"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.754263 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-config-data" (OuterVolumeSpecName: "config-data") pod "66789194-99ac-4fdf-9fc0-350fa5422867" (UID: "66789194-99ac-4fdf-9fc0-350fa5422867"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.792809 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.792857 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66789194-99ac-4fdf-9fc0-350fa5422867-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.792878 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss4kn\" (UniqueName: \"kubernetes.io/projected/66789194-99ac-4fdf-9fc0-350fa5422867-kube-api-access-ss4kn\") on node \"crc\" DevicePath \"\""
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.902876 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 21:58:09 crc kubenswrapper[4962]: I1201 21:58:09.921583 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.545998 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerStarted","Data":"8873855e8cc86b1a5f5e32c301660def4b38d97deda148e127ed139adbd358a5"}
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.546029 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.627194 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.675051 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.738797 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 01 21:58:10 crc kubenswrapper[4962]: E1201 21:58:10.739348 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-api"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.739365 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-api"
Dec 01 21:58:10 crc kubenswrapper[4962]: E1201 21:58:10.739379 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-log"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.739384 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-log"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.739621 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-api"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.739648 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" containerName="nova-api-log"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.740875 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.763318 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.763472 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.763577 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.795803 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.806282 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.844874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.844920 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2shql\" (UniqueName: \"kubernetes.io/projected/72f677b4-b1a9-46c6-8c28-a56f1310497a-kube-api-access-2shql\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.845048 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-config-data\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.845069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.845193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72f677b4-b1a9-46c6-8c28-a56f1310497a-logs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.845227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-public-tls-certs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.947719 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-config-data\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.947775 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.947989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72f677b4-b1a9-46c6-8c28-a56f1310497a-logs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.948035 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-public-tls-certs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.948083 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.948111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2shql\" (UniqueName: \"kubernetes.io/projected/72f677b4-b1a9-46c6-8c28-a56f1310497a-kube-api-access-2shql\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.948392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72f677b4-b1a9-46c6-8c28-a56f1310497a-logs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.953771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.953900 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-config-data\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.957390 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-public-tls-certs\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.964503 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
Dec 01 21:58:10 crc kubenswrapper[4962]: I1201 21:58:10.967194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2shql\" (UniqueName: \"kubernetes.io/projected/72f677b4-b1a9-46c6-8c28-a56f1310497a-kube-api-access-2shql\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0"
(UniqueName: \"kubernetes.io/projected/72f677b4-b1a9-46c6-8c28-a56f1310497a-kube-api-access-2shql\") pod \"nova-api-0\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " pod="openstack/nova-api-0" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.091093 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-snnm6"] Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.092625 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.095606 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.096394 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.096665 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.113250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-snnm6"] Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.255459 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzpd5\" (UniqueName: \"kubernetes.io/projected/36804ca4-2be7-4289-ac20-41e5209f0ae3-kube-api-access-kzpd5\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.255518 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-scripts\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.255872 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.256002 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-config-data\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.358395 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzpd5\" (UniqueName: \"kubernetes.io/projected/36804ca4-2be7-4289-ac20-41e5209f0ae3-kube-api-access-kzpd5\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.359141 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-scripts\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: 
\"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.360046 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.360117 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-config-data\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.368351 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-config-data\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.374815 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-scripts\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.387358 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzpd5\" (UniqueName: \"kubernetes.io/projected/36804ca4-2be7-4289-ac20-41e5209f0ae3-kube-api-access-kzpd5\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.395511 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-snnm6\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.424792 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.566398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerStarted","Data":"1b8680a116eb1730e294205b6fe4d9726f1a73c2af331267256f85134b3e5b63"} Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.669498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:11 crc kubenswrapper[4962]: I1201 21:58:11.937412 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-snnm6"] Dec 01 21:58:11 crc kubenswrapper[4962]: W1201 21:58:11.948804 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36804ca4_2be7_4289_ac20_41e5209f0ae3.slice/crio-2c3bce36604ad47c0c01cd236a8631b22b16316219326e2dcd9a506160831c74 WatchSource:0}: Error finding container 2c3bce36604ad47c0c01cd236a8631b22b16316219326e2dcd9a506160831c74: Status 404 returned error can't find the container with id 2c3bce36604ad47c0c01cd236a8631b22b16316219326e2dcd9a506160831c74 Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.234217 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66789194-99ac-4fdf-9fc0-350fa5422867" path="/var/lib/kubelet/pods/66789194-99ac-4fdf-9fc0-350fa5422867/volumes" Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.582119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72f677b4-b1a9-46c6-8c28-a56f1310497a","Type":"ContainerStarted","Data":"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074"} Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.582161 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72f677b4-b1a9-46c6-8c28-a56f1310497a","Type":"ContainerStarted","Data":"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379"} Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.582172 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72f677b4-b1a9-46c6-8c28-a56f1310497a","Type":"ContainerStarted","Data":"cddea78d1336f29507c7f6316b839fd708957fc7cdb66ac96b7a76ca66c32d97"} Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.587645 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-snnm6" event={"ID":"36804ca4-2be7-4289-ac20-41e5209f0ae3","Type":"ContainerStarted","Data":"5349d1e018999cb3ae1b93470b4e63c23ad7c10a2906009fcd6516c6f176afd9"} Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.587680 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-snnm6" event={"ID":"36804ca4-2be7-4289-ac20-41e5209f0ae3","Type":"ContainerStarted","Data":"2c3bce36604ad47c0c01cd236a8631b22b16316219326e2dcd9a506160831c74"} Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.602930 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.602909091 podStartE2EDuration="2.602909091s" podCreationTimestamp="2025-12-01 21:58:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:58:12.602272033 +0000 UTC m=+1476.703711228" watchObservedRunningTime="2025-12-01 21:58:12.602909091 +0000 UTC m=+1476.704348306" Dec 01 
21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.606685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerStarted","Data":"0fdbb7fc9151c4bfe4a3f9b7082315248516c2824a50d7b174c570405de73a0b"} Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.638075 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-snnm6" podStartSLOduration=1.638055305 podStartE2EDuration="1.638055305s" podCreationTimestamp="2025-12-01 21:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:58:12.62162619 +0000 UTC m=+1476.723065375" watchObservedRunningTime="2025-12-01 21:58:12.638055305 +0000 UTC m=+1476.739494500" Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.876133 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.983545 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-rw5lw"] Dec 01 21:58:12 crc kubenswrapper[4962]: I1201 21:58:12.983905 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" podUID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerName="dnsmasq-dns" containerID="cri-o://b022597fb5249e1e727c92ba7738ab1dc71db487e4c705c7d496197bf238eee1" gracePeriod=10 Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.626993 4962 generic.go:334] "Generic (PLEG): container finished" podID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerID="b022597fb5249e1e727c92ba7738ab1dc71db487e4c705c7d496197bf238eee1" exitCode=0 Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.629648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" event={"ID":"a81298b0-e9da-494b-ac1c-4c7e3e1be818","Type":"ContainerDied","Data":"b022597fb5249e1e727c92ba7738ab1dc71db487e4c705c7d496197bf238eee1"} Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.629682 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" event={"ID":"a81298b0-e9da-494b-ac1c-4c7e3e1be818","Type":"ContainerDied","Data":"649bf8f1fd0c04a10d7cd6a764163af6ac92ac0c1f940389a7a5fce714da3079"} Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.629697 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="649bf8f1fd0c04a10d7cd6a764163af6ac92ac0c1f940389a7a5fce714da3079" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.702473 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.851001 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-nb\") pod \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.851166 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-sb\") pod \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.851214 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxcbz\" (UniqueName: \"kubernetes.io/projected/a81298b0-e9da-494b-ac1c-4c7e3e1be818-kube-api-access-bxcbz\") pod \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.851238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-config\") pod \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.851294 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-svc\") pod \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.851327 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-swift-storage-0\") pod \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\" (UID: \"a81298b0-e9da-494b-ac1c-4c7e3e1be818\") " Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.860327 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81298b0-e9da-494b-ac1c-4c7e3e1be818-kube-api-access-bxcbz" (OuterVolumeSpecName: "kube-api-access-bxcbz") pod "a81298b0-e9da-494b-ac1c-4c7e3e1be818" (UID: "a81298b0-e9da-494b-ac1c-4c7e3e1be818"). InnerVolumeSpecName "kube-api-access-bxcbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.915025 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a81298b0-e9da-494b-ac1c-4c7e3e1be818" (UID: "a81298b0-e9da-494b-ac1c-4c7e3e1be818"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.919189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-config" (OuterVolumeSpecName: "config") pod "a81298b0-e9da-494b-ac1c-4c7e3e1be818" (UID: "a81298b0-e9da-494b-ac1c-4c7e3e1be818"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.919514 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a81298b0-e9da-494b-ac1c-4c7e3e1be818" (UID: "a81298b0-e9da-494b-ac1c-4c7e3e1be818"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.920761 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a81298b0-e9da-494b-ac1c-4c7e3e1be818" (UID: "a81298b0-e9da-494b-ac1c-4c7e3e1be818"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.922761 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a81298b0-e9da-494b-ac1c-4c7e3e1be818" (UID: "a81298b0-e9da-494b-ac1c-4c7e3e1be818"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.954678 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.954722 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.954731 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.954740 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxcbz\" (UniqueName: \"kubernetes.io/projected/a81298b0-e9da-494b-ac1c-4c7e3e1be818-kube-api-access-bxcbz\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.954751 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:13 crc kubenswrapper[4962]: I1201 21:58:13.954759 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a81298b0-e9da-494b-ac1c-4c7e3e1be818-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:14 crc kubenswrapper[4962]: I1201 21:58:14.640438 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-rw5lw" Dec 01 21:58:14 crc kubenswrapper[4962]: I1201 21:58:14.641890 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerStarted","Data":"f5cce121c6e120ee26efaae94bbeffca5976651918c066d3b6751059b27e89bf"} Dec 01 21:58:14 crc kubenswrapper[4962]: I1201 21:58:14.641952 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:58:14 crc kubenswrapper[4962]: I1201 21:58:14.673167 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.0905054 podStartE2EDuration="7.673143124s" podCreationTimestamp="2025-12-01 21:58:07 +0000 UTC" firstStartedPulling="2025-12-01 21:58:08.826640803 +0000 UTC m=+1472.928079998" lastFinishedPulling="2025-12-01 21:58:13.409278527 +0000 UTC m=+1477.510717722" observedRunningTime="2025-12-01 21:58:14.66560394 +0000 UTC m=+1478.767043145" watchObservedRunningTime="2025-12-01 21:58:14.673143124 +0000 UTC m=+1478.774582319" Dec 01 21:58:14 crc kubenswrapper[4962]: I1201 21:58:14.700390 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-rw5lw"] Dec 01 21:58:14 crc kubenswrapper[4962]: I1201 21:58:14.713520 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-rw5lw"] Dec 01 21:58:16 crc kubenswrapper[4962]: I1201 21:58:16.260858 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" path="/var/lib/kubelet/pods/a81298b0-e9da-494b-ac1c-4c7e3e1be818/volumes" Dec 01 21:58:17 crc kubenswrapper[4962]: I1201 21:58:17.677395 4962 generic.go:334] "Generic (PLEG): container finished" podID="36804ca4-2be7-4289-ac20-41e5209f0ae3" containerID="5349d1e018999cb3ae1b93470b4e63c23ad7c10a2906009fcd6516c6f176afd9" exitCode=0 Dec 01 21:58:17 crc kubenswrapper[4962]: I1201 21:58:17.677487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-snnm6" event={"ID":"36804ca4-2be7-4289-ac20-41e5209f0ae3","Type":"ContainerDied","Data":"5349d1e018999cb3ae1b93470b4e63c23ad7c10a2906009fcd6516c6f176afd9"} Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.242598 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.284385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-config-data\") pod \"36804ca4-2be7-4289-ac20-41e5209f0ae3\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.284507 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzpd5\" (UniqueName: \"kubernetes.io/projected/36804ca4-2be7-4289-ac20-41e5209f0ae3-kube-api-access-kzpd5\") pod \"36804ca4-2be7-4289-ac20-41e5209f0ae3\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.284587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-scripts\") pod \"36804ca4-2be7-4289-ac20-41e5209f0ae3\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.284603 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-combined-ca-bundle\") pod \"36804ca4-2be7-4289-ac20-41e5209f0ae3\" (UID: \"36804ca4-2be7-4289-ac20-41e5209f0ae3\") " Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.294150 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-scripts" (OuterVolumeSpecName: "scripts") pod "36804ca4-2be7-4289-ac20-41e5209f0ae3" (UID: "36804ca4-2be7-4289-ac20-41e5209f0ae3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.294260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36804ca4-2be7-4289-ac20-41e5209f0ae3-kube-api-access-kzpd5" (OuterVolumeSpecName: "kube-api-access-kzpd5") pod "36804ca4-2be7-4289-ac20-41e5209f0ae3" (UID: "36804ca4-2be7-4289-ac20-41e5209f0ae3"). InnerVolumeSpecName "kube-api-access-kzpd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.338118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36804ca4-2be7-4289-ac20-41e5209f0ae3" (UID: "36804ca4-2be7-4289-ac20-41e5209f0ae3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.344627 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-config-data" (OuterVolumeSpecName: "config-data") pod "36804ca4-2be7-4289-ac20-41e5209f0ae3" (UID: "36804ca4-2be7-4289-ac20-41e5209f0ae3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.389303 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.389334 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.389345 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36804ca4-2be7-4289-ac20-41e5209f0ae3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.389356 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzpd5\" (UniqueName: \"kubernetes.io/projected/36804ca4-2be7-4289-ac20-41e5209f0ae3-kube-api-access-kzpd5\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.713432 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-snnm6" event={"ID":"36804ca4-2be7-4289-ac20-41e5209f0ae3","Type":"ContainerDied","Data":"2c3bce36604ad47c0c01cd236a8631b22b16316219326e2dcd9a506160831c74"} Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.713512 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3bce36604ad47c0c01cd236a8631b22b16316219326e2dcd9a506160831c74" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.713517 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-snnm6" Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.939779 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.940291 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="712ea540-d060-4b1c-a201-7b7593c942dc" containerName="nova-scheduler-scheduler" containerID="cri-o://8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b" gracePeriod=30 Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.956661 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.956965 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-log" containerID="cri-o://7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379" gracePeriod=30 Dec 01 21:58:19 crc kubenswrapper[4962]: I1201 21:58:19.957140 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-api" containerID="cri-o://b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074" gracePeriod=30 Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.073047 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.073852 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" 
containerName="nova-metadata-log" containerID="cri-o://bc64d8a39eaab4dc24ac303ee3d12b50a27b7806023f4ec717d25cdc571c95c4" gracePeriod=30 Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.074472 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-metadata" containerID="cri-o://0b8cf51fd167f1e54c9c805e29c1a7697b89ff963a7381ef0fe424bb972a6250" gracePeriod=30 Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.601349 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.622078 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-combined-ca-bundle\") pod \"72f677b4-b1a9-46c6-8c28-a56f1310497a\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.622133 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-internal-tls-certs\") pod \"72f677b4-b1a9-46c6-8c28-a56f1310497a\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.622178 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-config-data\") pod \"72f677b4-b1a9-46c6-8c28-a56f1310497a\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.622244 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2shql\" (UniqueName: \"kubernetes.io/projected/72f677b4-b1a9-46c6-8c28-a56f1310497a-kube-api-access-2shql\") pod \"72f677b4-b1a9-46c6-8c28-a56f1310497a\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.622346 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-public-tls-certs\") pod \"72f677b4-b1a9-46c6-8c28-a56f1310497a\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.622428 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72f677b4-b1a9-46c6-8c28-a56f1310497a-logs\") pod \"72f677b4-b1a9-46c6-8c28-a56f1310497a\" (UID: \"72f677b4-b1a9-46c6-8c28-a56f1310497a\") " Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.623288 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72f677b4-b1a9-46c6-8c28-a56f1310497a-logs" (OuterVolumeSpecName: "logs") pod "72f677b4-b1a9-46c6-8c28-a56f1310497a" (UID: "72f677b4-b1a9-46c6-8c28-a56f1310497a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.629358 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f677b4-b1a9-46c6-8c28-a56f1310497a-kube-api-access-2shql" (OuterVolumeSpecName: "kube-api-access-2shql") pod "72f677b4-b1a9-46c6-8c28-a56f1310497a" (UID: "72f677b4-b1a9-46c6-8c28-a56f1310497a"). InnerVolumeSpecName "kube-api-access-2shql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.668124 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-config-data" (OuterVolumeSpecName: "config-data") pod "72f677b4-b1a9-46c6-8c28-a56f1310497a" (UID: "72f677b4-b1a9-46c6-8c28-a56f1310497a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.692491 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72f677b4-b1a9-46c6-8c28-a56f1310497a" (UID: "72f677b4-b1a9-46c6-8c28-a56f1310497a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.708119 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "72f677b4-b1a9-46c6-8c28-a56f1310497a" (UID: "72f677b4-b1a9-46c6-8c28-a56f1310497a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.713668 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "72f677b4-b1a9-46c6-8c28-a56f1310497a" (UID: "72f677b4-b1a9-46c6-8c28-a56f1310497a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.725254 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.725286 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.725299 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.725312 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2shql\" (UniqueName: \"kubernetes.io/projected/72f677b4-b1a9-46c6-8c28-a56f1310497a-kube-api-access-2shql\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.725327 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f677b4-b1a9-46c6-8c28-a56f1310497a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.725338 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72f677b4-b1a9-46c6-8c28-a56f1310497a-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.730773 4962 generic.go:334] "Generic (PLEG): container finished" podID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerID="b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074" exitCode=0 Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.730803 4962 generic.go:334] "Generic (PLEG): container finished" podID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerID="7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379" exitCode=143 Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.730861 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72f677b4-b1a9-46c6-8c28-a56f1310497a","Type":"ContainerDied","Data":"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074"} Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.730878 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.730891 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72f677b4-b1a9-46c6-8c28-a56f1310497a","Type":"ContainerDied","Data":"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379"} Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.730905 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"72f677b4-b1a9-46c6-8c28-a56f1310497a","Type":"ContainerDied","Data":"cddea78d1336f29507c7f6316b839fd708957fc7cdb66ac96b7a76ca66c32d97"} Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.730925 4962 scope.go:117] "RemoveContainer" containerID="b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.733306 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerID="bc64d8a39eaab4dc24ac303ee3d12b50a27b7806023f4ec717d25cdc571c95c4" exitCode=143 Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.733344 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176fcc-efb7-4ace-90bd-5a0f95763c00","Type":"ContainerDied","Data":"bc64d8a39eaab4dc24ac303ee3d12b50a27b7806023f4ec717d25cdc571c95c4"} Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.765367 4962 scope.go:117] "RemoveContainer" containerID="7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.797053 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.806680 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.808488 4962 scope.go:117] "RemoveContainer" containerID="b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074" Dec 01 21:58:20 crc kubenswrapper[4962]: E1201 21:58:20.809211 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074\": container with ID starting with b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074 not found: ID does not exist" containerID="b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.809253 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074"} err="failed to get container status \"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074\": rpc error: code = NotFound desc = could not find container \"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074\": container with ID starting with b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074 not found: ID does not exist" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.809281 4962 scope.go:117] "RemoveContainer" containerID="7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379" Dec 01 21:58:20 crc kubenswrapper[4962]: E1201 21:58:20.809502 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379\": container with ID starting with 
7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379 not found: ID does not exist" containerID="7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.809516 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379"} err="failed to get container status \"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379\": rpc error: code = NotFound desc = could not find container \"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379\": container with ID starting with 7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379 not found: ID does not exist" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.809528 4962 scope.go:117] "RemoveContainer" containerID="b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.809687 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074"} err="failed to get container status \"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074\": rpc error: code = NotFound desc = could not find container \"b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074\": container with ID starting with b008e8f1319f030264cec8aff13e7b2c8692e9d690fb13d636b13c397f5d4074 not found: ID does not exist" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.809700 4962 scope.go:117] "RemoveContainer" containerID="7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.809847 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379"} err="failed to get container status \"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379\": rpc error: code = NotFound desc = could not find container \"7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379\": container with ID starting with 7ea3a0d85cd874b6654e5d0264ff9c6b8657b0238843ca723ddae6df8af27379 not found: ID does not exist" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.821366 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:20 crc kubenswrapper[4962]: E1201 21:58:20.821907 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerName="init" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.821921 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerName="init" Dec 01 21:58:20 crc kubenswrapper[4962]: E1201 21:58:20.821968 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerName="dnsmasq-dns" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.821977 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerName="dnsmasq-dns" Dec 01 21:58:20 crc kubenswrapper[4962]: E1201 21:58:20.822009 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-log" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.822016 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-log" Dec 01 21:58:20 crc kubenswrapper[4962]: E1201 21:58:20.822038 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-api" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.822044 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-api" Dec 01 21:58:20 crc kubenswrapper[4962]: E1201 21:58:20.822066 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36804ca4-2be7-4289-ac20-41e5209f0ae3" containerName="nova-manage" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.822073 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="36804ca4-2be7-4289-ac20-41e5209f0ae3" containerName="nova-manage" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.822275 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-log" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.822298 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="36804ca4-2be7-4289-ac20-41e5209f0ae3" containerName="nova-manage" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.822312 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81298b0-e9da-494b-ac1c-4c7e3e1be818" containerName="dnsmasq-dns" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.822325 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" containerName="nova-api-api" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.823549 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.826529 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.826807 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.828159 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.831767 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.929096 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.929149 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabf746f-d3b5-4858-a2ec-9a5cec96720a-logs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.929201 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-config-data\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" 
Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.929324 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8ckv\" (UniqueName: \"kubernetes.io/projected/aabf746f-d3b5-4858-a2ec-9a5cec96720a-kube-api-access-d8ckv\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.929366 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:20 crc kubenswrapper[4962]: I1201 21:58:20.929479 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-public-tls-certs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.031352 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-public-tls-certs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.031433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.031475 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabf746f-d3b5-4858-a2ec-9a5cec96720a-logs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.031539 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-config-data\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.031594 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8ckv\" (UniqueName: \"kubernetes.io/projected/aabf746f-d3b5-4858-a2ec-9a5cec96720a-kube-api-access-d8ckv\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.031615 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.032162 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabf746f-d3b5-4858-a2ec-9a5cec96720a-logs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " 
pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.036636 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.036681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-public-tls-certs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.037322 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.038988 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabf746f-d3b5-4858-a2ec-9a5cec96720a-config-data\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.050814 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8ckv\" (UniqueName: \"kubernetes.io/projected/aabf746f-d3b5-4858-a2ec-9a5cec96720a-kube-api-access-d8ckv\") pod \"nova-api-0\" (UID: \"aabf746f-d3b5-4858-a2ec-9a5cec96720a\") " pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.150491 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 21:58:21 crc kubenswrapper[4962]: E1201 21:58:21.346602 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 21:58:21 crc kubenswrapper[4962]: E1201 21:58:21.349256 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 21:58:21 crc kubenswrapper[4962]: E1201 21:58:21.351337 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 21:58:21 crc kubenswrapper[4962]: E1201 21:58:21.351390 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="712ea540-d060-4b1c-a201-7b7593c942dc" containerName="nova-scheduler-scheduler" Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.669011 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 21:58:21 crc kubenswrapper[4962]: W1201 21:58:21.672101 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaabf746f_d3b5_4858_a2ec_9a5cec96720a.slice/crio-c4056dcafebb02a8174c630a48f0bafb1ca1ff2bdbb9085286c9790795e0f925 WatchSource:0}: Error finding container c4056dcafebb02a8174c630a48f0bafb1ca1ff2bdbb9085286c9790795e0f925: Status 404 returned error can't find the container with id c4056dcafebb02a8174c630a48f0bafb1ca1ff2bdbb9085286c9790795e0f925 Dec 01 21:58:21 crc kubenswrapper[4962]: I1201 21:58:21.751278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aabf746f-d3b5-4858-a2ec-9a5cec96720a","Type":"ContainerStarted","Data":"c4056dcafebb02a8174c630a48f0bafb1ca1ff2bdbb9085286c9790795e0f925"} Dec 01 21:58:22 crc kubenswrapper[4962]: I1201 21:58:22.234958 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f677b4-b1a9-46c6-8c28-a56f1310497a" path="/var/lib/kubelet/pods/72f677b4-b1a9-46c6-8c28-a56f1310497a/volumes" Dec 01 21:58:22 crc kubenswrapper[4962]: I1201 21:58:22.767734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aabf746f-d3b5-4858-a2ec-9a5cec96720a","Type":"ContainerStarted","Data":"e17561865d9554cd0d43ce9e5617e1b1ac4d8a26e5f1254d9e948a7123b49960"} Dec 01 21:58:22 crc kubenswrapper[4962]: I1201 21:58:22.767801 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aabf746f-d3b5-4858-a2ec-9a5cec96720a","Type":"ContainerStarted","Data":"b935cf1170ec104bc6f763e7c4762c348356e2d76a5ffbff7285f9d8cf5494dc"} Dec 01 21:58:22 crc kubenswrapper[4962]: I1201 21:58:22.813863 4962 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.813838249 podStartE2EDuration="2.813838249s" podCreationTimestamp="2025-12-01 21:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:58:22.799786101 +0000 UTC m=+1486.901225296" watchObservedRunningTime="2025-12-01 21:58:22.813838249 +0000 UTC m=+1486.915277464" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.219665 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.243:8775/\": read tcp 10.217.0.2:35650->10.217.0.243:8775: read: connection reset by peer" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.219788 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.243:8775/\": read tcp 10.217.0.2:35658->10.217.0.243:8775: read: connection reset by peer" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.798737 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerID="0b8cf51fd167f1e54c9c805e29c1a7697b89ff963a7381ef0fe424bb972a6250" exitCode=0 Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.798842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176fcc-efb7-4ace-90bd-5a0f95763c00","Type":"ContainerDied","Data":"0b8cf51fd167f1e54c9c805e29c1a7697b89ff963a7381ef0fe424bb972a6250"} Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.799113 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176fcc-efb7-4ace-90bd-5a0f95763c00","Type":"ContainerDied","Data":"86c61da88cdacac8ee16816559ea7c139a64bbae7e6bf1f279b2bd84a76294f7"} Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.799145 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86c61da88cdacac8ee16816559ea7c139a64bbae7e6bf1f279b2bd84a76294f7" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.829702 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.924632 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-nova-metadata-tls-certs\") pod \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.925273 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-config-data\") pod \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.925676 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl7fx\" (UniqueName: \"kubernetes.io/projected/a4176fcc-efb7-4ace-90bd-5a0f95763c00-kube-api-access-fl7fx\") pod \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.926487 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-combined-ca-bundle\") pod \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.927001 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176fcc-efb7-4ace-90bd-5a0f95763c00-logs\") pod \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\" (UID: \"a4176fcc-efb7-4ace-90bd-5a0f95763c00\") " Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.927631 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4176fcc-efb7-4ace-90bd-5a0f95763c00-logs" (OuterVolumeSpecName: "logs") pod "a4176fcc-efb7-4ace-90bd-5a0f95763c00" (UID: "a4176fcc-efb7-4ace-90bd-5a0f95763c00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.931404 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176fcc-efb7-4ace-90bd-5a0f95763c00-logs\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.932229 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4176fcc-efb7-4ace-90bd-5a0f95763c00-kube-api-access-fl7fx" (OuterVolumeSpecName: "kube-api-access-fl7fx") pod "a4176fcc-efb7-4ace-90bd-5a0f95763c00" (UID: "a4176fcc-efb7-4ace-90bd-5a0f95763c00"). InnerVolumeSpecName "kube-api-access-fl7fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:58:23 crc kubenswrapper[4962]: I1201 21:58:23.968826 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4176fcc-efb7-4ace-90bd-5a0f95763c00" (UID: "a4176fcc-efb7-4ace-90bd-5a0f95763c00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.004528 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a4176fcc-efb7-4ace-90bd-5a0f95763c00" (UID: "a4176fcc-efb7-4ace-90bd-5a0f95763c00"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.007581 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-config-data" (OuterVolumeSpecName: "config-data") pod "a4176fcc-efb7-4ace-90bd-5a0f95763c00" (UID: "a4176fcc-efb7-4ace-90bd-5a0f95763c00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.036093 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.036138 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.036154 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176fcc-efb7-4ace-90bd-5a0f95763c00-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.036165 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl7fx\" (UniqueName: \"kubernetes.io/projected/a4176fcc-efb7-4ace-90bd-5a0f95763c00-kube-api-access-fl7fx\") on node \"crc\" DevicePath \"\"" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.816656 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.861072 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.887108 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.908903 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 21:58:24 crc kubenswrapper[4962]: E1201 21:58:24.909584 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-metadata" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.909607 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-metadata" Dec 01 21:58:24 crc kubenswrapper[4962]: E1201 21:58:24.909669 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-log" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.909679 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-log" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.909996 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-log" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.910034 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" containerName="nova-metadata-metadata" Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.911625 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.917769 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.919306 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 01 21:58:24 crc kubenswrapper[4962]: I1201 21:58:24.932068 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.064348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-config-data\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.064478 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46f24440-8e28-4c9e-908d-ca07fd2edcfc-logs\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.064514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2476n\" (UniqueName: \"kubernetes.io/projected/46f24440-8e28-4c9e-908d-ca07fd2edcfc-kube-api-access-2476n\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.064542 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.064578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.172887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-config-data\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.173081 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46f24440-8e28-4c9e-908d-ca07fd2edcfc-logs\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.173126 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2476n\" (UniqueName: \"kubernetes.io/projected/46f24440-8e28-4c9e-908d-ca07fd2edcfc-kube-api-access-2476n\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.173158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.173199 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.173883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46f24440-8e28-4c9e-908d-ca07fd2edcfc-logs\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.178461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.178696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-config-data\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.179157 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f24440-8e28-4c9e-908d-ca07fd2edcfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.190221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2476n\" (UniqueName: \"kubernetes.io/projected/46f24440-8e28-4c9e-908d-ca07fd2edcfc-kube-api-access-2476n\") pod \"nova-metadata-0\" (UID: \"46f24440-8e28-4c9e-908d-ca07fd2edcfc\") " pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.235884 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.786965 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.832027 4962 generic.go:334] "Generic (PLEG): container finished" podID="712ea540-d060-4b1c-a201-7b7593c942dc" containerID="8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b" exitCode=0
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.832090 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.832088 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"712ea540-d060-4b1c-a201-7b7593c942dc","Type":"ContainerDied","Data":"8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b"}
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.832251 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"712ea540-d060-4b1c-a201-7b7593c942dc","Type":"ContainerDied","Data":"b0a780728cfa505e8b4801b2697e7f49c8c7712c031d4ec20a6733fcc5bfc2fa"}
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.832285 4962 scope.go:117] "RemoveContainer" containerID="8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.872451 4962 scope.go:117] "RemoveContainer" containerID="8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b"
Dec 01 21:58:25 crc kubenswrapper[4962]: E1201 21:58:25.873011 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b\": container with ID starting with 8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b not found: ID does not exist" containerID="8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.873060 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b"} err="failed to get container status \"8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b\": rpc error: code = NotFound desc = could not find container \"8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b\": container with ID starting with 8b0a63c02087acfbac6d8dbc35975bd714c984b8a2ea987c0781e72038a5c36b not found: ID does not exist"
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.894045 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzr7x\" (UniqueName: \"kubernetes.io/projected/712ea540-d060-4b1c-a201-7b7593c942dc-kube-api-access-fzr7x\") pod \"712ea540-d060-4b1c-a201-7b7593c942dc\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") "
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.894712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-config-data\") pod \"712ea540-d060-4b1c-a201-7b7593c942dc\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") "
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.894743 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-combined-ca-bundle\") pod \"712ea540-d060-4b1c-a201-7b7593c942dc\" (UID: \"712ea540-d060-4b1c-a201-7b7593c942dc\") "
Dec 01 21:58:25 crc kubenswrapper[4962]: W1201 21:58:25.898364 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46f24440_8e28_4c9e_908d_ca07fd2edcfc.slice/crio-30af7e29c5a04da06858f84df8f91bbd3958ac7f9082138eb367a90f5f0b85b4 WatchSource:0}: Error finding container 30af7e29c5a04da06858f84df8f91bbd3958ac7f9082138eb367a90f5f0b85b4: Status 404 returned error can't find the container with id 30af7e29c5a04da06858f84df8f91bbd3958ac7f9082138eb367a90f5f0b85b4
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.901543 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.902944 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712ea540-d060-4b1c-a201-7b7593c942dc-kube-api-access-fzr7x" (OuterVolumeSpecName: "kube-api-access-fzr7x") pod "712ea540-d060-4b1c-a201-7b7593c942dc" (UID: "712ea540-d060-4b1c-a201-7b7593c942dc"). InnerVolumeSpecName "kube-api-access-fzr7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.959341 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-config-data" (OuterVolumeSpecName: "config-data") pod "712ea540-d060-4b1c-a201-7b7593c942dc" (UID: "712ea540-d060-4b1c-a201-7b7593c942dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.961521 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "712ea540-d060-4b1c-a201-7b7593c942dc" (UID: "712ea540-d060-4b1c-a201-7b7593c942dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.997964 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzr7x\" (UniqueName: \"kubernetes.io/projected/712ea540-d060-4b1c-a201-7b7593c942dc-kube-api-access-fzr7x\") on node \"crc\" DevicePath \"\""
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.998059 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 21:58:25 crc kubenswrapper[4962]: I1201 21:58:25.998080 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712ea540-d060-4b1c-a201-7b7593c942dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.179013 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.192605 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.207556 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 21:58:26 crc kubenswrapper[4962]: E1201 21:58:26.208394 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712ea540-d060-4b1c-a201-7b7593c942dc" containerName="nova-scheduler-scheduler"
Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.208425 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="712ea540-d060-4b1c-a201-7b7593c942dc" containerName="nova-scheduler-scheduler"
Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.208919 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="712ea540-d060-4b1c-a201-7b7593c942dc" containerName="nova-scheduler-scheduler"
containerName="nova-scheduler-scheduler" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.210408 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.214389 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.250075 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712ea540-d060-4b1c-a201-7b7593c942dc" path="/var/lib/kubelet/pods/712ea540-d060-4b1c-a201-7b7593c942dc/volumes" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.250810 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4176fcc-efb7-4ace-90bd-5a0f95763c00" path="/var/lib/kubelet/pods/a4176fcc-efb7-4ace-90bd-5a0f95763c00/volumes" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.251828 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.307187 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02074ca6-7293-4d4a-8354-f299b4ae4b5a-config-data\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.307393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02074ca6-7293-4d4a-8354-f299b4ae4b5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.307530 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7gr\" (UniqueName: \"kubernetes.io/projected/02074ca6-7293-4d4a-8354-f299b4ae4b5a-kube-api-access-jb7gr\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.410852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02074ca6-7293-4d4a-8354-f299b4ae4b5a-config-data\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.411304 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02074ca6-7293-4d4a-8354-f299b4ae4b5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.411362 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7gr\" (UniqueName: \"kubernetes.io/projected/02074ca6-7293-4d4a-8354-f299b4ae4b5a-kube-api-access-jb7gr\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.417359 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02074ca6-7293-4d4a-8354-f299b4ae4b5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.417661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02074ca6-7293-4d4a-8354-f299b4ae4b5a-config-data\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.431755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7gr\" (UniqueName: \"kubernetes.io/projected/02074ca6-7293-4d4a-8354-f299b4ae4b5a-kube-api-access-jb7gr\") pod \"nova-scheduler-0\" (UID: \"02074ca6-7293-4d4a-8354-f299b4ae4b5a\") " pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.640076 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.904246 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46f24440-8e28-4c9e-908d-ca07fd2edcfc","Type":"ContainerStarted","Data":"fd01fa4629a38d4de08b453fe0398b6abfa26274f561ec17af870525bada5a1f"} Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.904541 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46f24440-8e28-4c9e-908d-ca07fd2edcfc","Type":"ContainerStarted","Data":"454c2e9389ec56cafe3e9c334b6fb0ba5aa0e12024b19be551381b4a6d0bcc6c"} Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.904554 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46f24440-8e28-4c9e-908d-ca07fd2edcfc","Type":"ContainerStarted","Data":"30af7e29c5a04da06858f84df8f91bbd3958ac7f9082138eb367a90f5f0b85b4"} Dec 01 21:58:26 crc kubenswrapper[4962]: I1201 21:58:26.947283 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9472685370000002 podStartE2EDuration="2.947268537s" podCreationTimestamp="2025-12-01 21:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:58:26.942357528 +0000 UTC m=+1491.043796733" watchObservedRunningTime="2025-12-01 21:58:26.947268537 +0000 UTC m=+1491.048707732" Dec 01 21:58:27 crc kubenswrapper[4962]: I1201 21:58:27.257309 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 21:58:27 crc kubenswrapper[4962]: W1201 21:58:27.259971 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02074ca6_7293_4d4a_8354_f299b4ae4b5a.slice/crio-e0f5acaff6b043cebb24a0441f83a25d5ab6b3a852bd1d35376fba3c03ac0fcd WatchSource:0}: Error finding container e0f5acaff6b043cebb24a0441f83a25d5ab6b3a852bd1d35376fba3c03ac0fcd: Status 404 returned error can't find the container with id e0f5acaff6b043cebb24a0441f83a25d5ab6b3a852bd1d35376fba3c03ac0fcd Dec 01 21:58:27 crc kubenswrapper[4962]: I1201 21:58:27.918188 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02074ca6-7293-4d4a-8354-f299b4ae4b5a","Type":"ContainerStarted","Data":"9dfacc7b40c17fa6b6b2e5e456efd979b5f4db1b25213f9c73842bf735878ba9"} Dec 01 
21:58:27 crc kubenswrapper[4962]: I1201 21:58:27.918791 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02074ca6-7293-4d4a-8354-f299b4ae4b5a","Type":"ContainerStarted","Data":"e0f5acaff6b043cebb24a0441f83a25d5ab6b3a852bd1d35376fba3c03ac0fcd"} Dec 01 21:58:27 crc kubenswrapper[4962]: I1201 21:58:27.938157 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9381373179999999 podStartE2EDuration="1.938137318s" podCreationTimestamp="2025-12-01 21:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:58:27.935890644 +0000 UTC m=+1492.037329879" watchObservedRunningTime="2025-12-01 21:58:27.938137318 +0000 UTC m=+1492.039576533" Dec 01 21:58:30 crc kubenswrapper[4962]: I1201 21:58:30.240442 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 21:58:30 crc kubenswrapper[4962]: I1201 21:58:30.240903 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 21:58:31 crc kubenswrapper[4962]: I1201 21:58:31.150876 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 21:58:31 crc kubenswrapper[4962]: I1201 21:58:31.150985 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 21:58:31 crc kubenswrapper[4962]: I1201 21:58:31.640894 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 21:58:32 crc kubenswrapper[4962]: I1201 21:58:32.173180 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aabf746f-d3b5-4858-a2ec-9a5cec96720a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.0:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 21:58:32 crc kubenswrapper[4962]: I1201 21:58:32.173814 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aabf746f-d3b5-4858-a2ec-9a5cec96720a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.0:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 21:58:35 crc kubenswrapper[4962]: I1201 21:58:35.237178 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 21:58:35 crc kubenswrapper[4962]: I1201 21:58:35.238509 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 21:58:36 crc kubenswrapper[4962]: I1201 21:58:36.251109 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="46f24440-8e28-4c9e-908d-ca07fd2edcfc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.1:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 21:58:36 crc kubenswrapper[4962]: I1201 21:58:36.251109 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="46f24440-8e28-4c9e-908d-ca07fd2edcfc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.1:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 21:58:36 crc kubenswrapper[4962]: I1201 
21:58:36.641996 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 21:58:36 crc kubenswrapper[4962]: I1201 21:58:36.703340 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 21:58:37 crc kubenswrapper[4962]: I1201 21:58:37.131551 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 21:58:38 crc kubenswrapper[4962]: I1201 21:58:38.245001 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 21:58:41 crc kubenswrapper[4962]: I1201 21:58:41.166139 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 21:58:41 crc kubenswrapper[4962]: I1201 21:58:41.166758 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 21:58:41 crc kubenswrapper[4962]: I1201 21:58:41.167682 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 21:58:41 crc kubenswrapper[4962]: I1201 21:58:41.167706 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 21:58:41 crc kubenswrapper[4962]: I1201 21:58:41.182419 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 21:58:41 crc kubenswrapper[4962]: I1201 21:58:41.182592 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 21:58:45 crc kubenswrapper[4962]: I1201 21:58:45.245732 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 21:58:45 crc kubenswrapper[4962]: I1201 21:58:45.249448 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 21:58:45 crc kubenswrapper[4962]: I1201 21:58:45.252087 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 21:58:46 crc kubenswrapper[4962]: I1201 21:58:46.336372 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.499435 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-x55sw"] Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.510668 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-x55sw"] Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.619289 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-h98rk"] Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.621379 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.631811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-config-data\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.631988 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d4h9\" (UniqueName: \"kubernetes.io/projected/ad0d848c-97d4-4360-a1e6-335cd2a8896c-kube-api-access-9d4h9\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.632309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-combined-ca-bundle\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.633925 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-h98rk"]
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.749423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-config-data\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.749787 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d4h9\" (UniqueName: \"kubernetes.io/projected/ad0d848c-97d4-4360-a1e6-335cd2a8896c-kube-api-access-9d4h9\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.750089 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-combined-ca-bundle\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.764619 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-combined-ca-bundle\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.765041 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-config-data\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.767150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d4h9\" (UniqueName: \"kubernetes.io/projected/ad0d848c-97d4-4360-a1e6-335cd2a8896c-kube-api-access-9d4h9\") pod \"heat-db-sync-h98rk\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:56 crc kubenswrapper[4962]: I1201 21:58:56.943921 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h98rk"
Dec 01 21:58:57 crc kubenswrapper[4962]: I1201 21:58:57.507474 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-h98rk"]
Dec 01 21:58:58 crc kubenswrapper[4962]: I1201 21:58:58.245091 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873a333f-1f2b-4824-86a7-7935ff6908f9" path="/var/lib/kubelet/pods/873a333f-1f2b-4824-86a7-7935ff6908f9/volumes"
Dec 01 21:58:58 crc kubenswrapper[4962]: I1201 21:58:58.435001 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h98rk" event={"ID":"ad0d848c-97d4-4360-a1e6-335cd2a8896c","Type":"ContainerStarted","Data":"59c21c945b2337826c79dade86fcbf0cca0c838d91c1dae9f9d8fbed329ede84"}
Dec 01 21:58:58 crc kubenswrapper[4962]: I1201 21:58:58.758284 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 21:58:59 crc kubenswrapper[4962]: I1201 21:58:59.256485 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 21:58:59 crc kubenswrapper[4962]: I1201 21:58:59.256976 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-central-agent" containerID="cri-o://8873855e8cc86b1a5f5e32c301660def4b38d97deda148e127ed139adbd358a5" gracePeriod=30
Dec 01 21:58:59 crc kubenswrapper[4962]: I1201 21:58:59.257120 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="proxy-httpd" containerID="cri-o://f5cce121c6e120ee26efaae94bbeffca5976651918c066d3b6751059b27e89bf" gracePeriod=30
Dec 01 21:58:59 crc kubenswrapper[4962]: I1201 21:58:59.257200 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-notification-agent" containerID="cri-o://1b8680a116eb1730e294205b6fe4d9726f1a73c2af331267256f85134b3e5b63" gracePeriod=30
Dec 01 21:58:59 crc kubenswrapper[4962]: I1201 21:58:59.257198 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="sg-core" containerID="cri-o://0fdbb7fc9151c4bfe4a3f9b7082315248516c2824a50d7b174c570405de73a0b" gracePeriod=30
Dec 01 21:58:59 crc kubenswrapper[4962]: I1201 21:58:59.742036 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 01 21:59:00 crc kubenswrapper[4962]: I1201 21:59:00.463849 4962 generic.go:334] "Generic (PLEG): container finished" podID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerID="f5cce121c6e120ee26efaae94bbeffca5976651918c066d3b6751059b27e89bf" exitCode=0
Dec 01 21:59:00 crc kubenswrapper[4962]: I1201 21:59:00.464177 4962 generic.go:334] "Generic (PLEG): container finished" podID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerID="0fdbb7fc9151c4bfe4a3f9b7082315248516c2824a50d7b174c570405de73a0b" exitCode=2
Dec 01 21:59:00 crc kubenswrapper[4962]: I1201 21:59:00.464202 4962 generic.go:334] "Generic (PLEG): container finished" podID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerID="8873855e8cc86b1a5f5e32c301660def4b38d97deda148e127ed139adbd358a5" exitCode=0
Dec 01 21:59:00 crc kubenswrapper[4962]: I1201 21:59:00.464222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerDied","Data":"f5cce121c6e120ee26efaae94bbeffca5976651918c066d3b6751059b27e89bf"}
Dec 01 21:59:00 crc kubenswrapper[4962]: I1201 21:59:00.464248 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerDied","Data":"0fdbb7fc9151c4bfe4a3f9b7082315248516c2824a50d7b174c570405de73a0b"}
Dec 01 21:59:00 crc kubenswrapper[4962]: I1201 21:59:00.464257 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerDied","Data":"8873855e8cc86b1a5f5e32c301660def4b38d97deda148e127ed139adbd358a5"}
Dec 01 21:59:01 crc kubenswrapper[4962]: I1201 21:59:01.316465 4962 scope.go:117] "RemoveContainer" containerID="ef39056f5dc29b17fbcdcec508f51b55561a93e38b7c4f611a448d91a3c73791"
Dec 01 21:59:01 crc kubenswrapper[4962]: I1201 21:59:01.371384 4962 scope.go:117] "RemoveContainer" containerID="f4f41170d9e311be4cf208f6f061d3c7127e6a261d74c26dee78e6b911148fa1"
Dec 01 21:59:03 crc kubenswrapper[4962]: I1201 21:59:03.298094 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="rabbitmq" containerID="cri-o://461b6d48eac2ff8ad256d35fbf30e75bf1e1a0278acfdd421ba2a7d95ae106de" gracePeriod=604796
Dec 01 21:59:03 crc kubenswrapper[4962]: I1201 21:59:03.948199 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="rabbitmq" containerID="cri-o://e05d8b8b65733b514c93844819537d91355a50c1bbe84d5b1f3c2f1e6383e213" gracePeriod=604796
Dec 01 21:59:04 crc kubenswrapper[4962]: I1201 21:59:04.405978 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused"
Dec 01 21:59:04 crc kubenswrapper[4962]: I1201 21:59:04.822741 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused"
Dec 01 21:59:05 crc kubenswrapper[4962]: I1201 21:59:05.554338 4962 generic.go:334] "Generic (PLEG): container finished" podID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerID="1b8680a116eb1730e294205b6fe4d9726f1a73c2af331267256f85134b3e5b63" exitCode=0
Dec 01 21:59:05 crc kubenswrapper[4962]: I1201 21:59:05.554556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerDied","Data":"1b8680a116eb1730e294205b6fe4d9726f1a73c2af331267256f85134b3e5b63"}
Dec 01 21:59:08 crc kubenswrapper[4962]: I1201 21:59:08.233433 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.253:3000/\": dial tcp 10.217.0.253:3000: connect: connection refused"
Dec 01 21:59:09 crc kubenswrapper[4962]: I1201 21:59:09.599406 4962 generic.go:334] "Generic (PLEG): container finished" podID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerID="461b6d48eac2ff8ad256d35fbf30e75bf1e1a0278acfdd421ba2a7d95ae106de" exitCode=0
Dec 01 21:59:09 crc kubenswrapper[4962]: I1201 21:59:09.599546 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9ef8bb6-0fc4-411e-82a1-85d95ced5818","Type":"ContainerDied","Data":"461b6d48eac2ff8ad256d35fbf30e75bf1e1a0278acfdd421ba2a7d95ae106de"}
Dec 01 21:59:11 crc kubenswrapper[4962]: I1201 21:59:11.635957 4962 generic.go:334] "Generic (PLEG): container finished" podID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerID="e05d8b8b65733b514c93844819537d91355a50c1bbe84d5b1f3c2f1e6383e213" exitCode=0
Dec 01 21:59:11 crc kubenswrapper[4962]: I1201 21:59:11.636042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e9a059a-712b-4ff4-b50e-7d94a96a9db5","Type":"ContainerDied","Data":"e05d8b8b65733b514c93844819537d91355a50c1bbe84d5b1f3c2f1e6383e213"}
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.236910 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-6n7cg"]
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.239231 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.241142 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.267303 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-6n7cg"]
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.378522 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.378647 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.378828 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.378864 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6gs7\" (UniqueName: \"kubernetes.io/projected/51151665-c2e2-48f9-8f33-179a6e8c98fe-kube-api-access-x6gs7\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.378901 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.379013 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.379049 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-config\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.405565 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.409285 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-6n7cg"]
Dec 01 21:59:14 crc kubenswrapper[4962]: E1201 21:59:14.410257 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-x6gs7 openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg" podUID="51151665-c2e2-48f9-8f33-179a6e8c98fe"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.424276 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-grqhx"]
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.426460 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.454914 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-grqhx"]
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.482084 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.482131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-config\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.482710 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.482797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.482890 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.482915 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6gs7\" (UniqueName: \"kubernetes.io/projected/51151665-c2e2-48f9-8f33-179a6e8c98fe-kube-api-access-x6gs7\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.482960 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.483676 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.483798 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.483845 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.484017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-config\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.484032 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.485056 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.502819 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6gs7\" (UniqueName: \"kubernetes.io/projected/51151665-c2e2-48f9-8f33-179a6e8c98fe-kube-api-access-x6gs7\") pod \"dnsmasq-dns-5b75489c6f-6n7cg\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") " pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.585623 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.585668 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.585767 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.585788 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-config\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.585879 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjmhq\" (UniqueName: \"kubernetes.io/projected/c626973f-0e99-4e4b-bc2b-8caddbada7aa-kube-api-access-gjmhq\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.585895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.585920 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.680526 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.687579 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjmhq\" (UniqueName: \"kubernetes.io/projected/c626973f-0e99-4e4b-bc2b-8caddbada7aa-kube-api-access-gjmhq\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.687628 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.687668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.687729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.687745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
\"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.687827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.687846 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-config\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.688750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-config\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.688824 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.689428 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.689447 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.690054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.690131 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c626973f-0e99-4e4b-bc2b-8caddbada7aa-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.698594 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.730434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjmhq\" (UniqueName: \"kubernetes.io/projected/c626973f-0e99-4e4b-bc2b-8caddbada7aa-kube-api-access-gjmhq\") pod \"dnsmasq-dns-5d75f767dc-grqhx\" (UID: \"c626973f-0e99-4e4b-bc2b-8caddbada7aa\") " pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.743646 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.786476 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused"
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.791237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-sb\") pod \"51151665-c2e2-48f9-8f33-179a6e8c98fe\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") "
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.791475 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-svc\") pod \"51151665-c2e2-48f9-8f33-179a6e8c98fe\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") "
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.791516 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6gs7\" (UniqueName: \"kubernetes.io/projected/51151665-c2e2-48f9-8f33-179a6e8c98fe-kube-api-access-x6gs7\") pod \"51151665-c2e2-48f9-8f33-179a6e8c98fe\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") "
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.791600 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-config\") pod \"51151665-c2e2-48f9-8f33-179a6e8c98fe\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") "
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.791633 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-swift-storage-0\") pod \"51151665-c2e2-48f9-8f33-179a6e8c98fe\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") "
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.791853 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-nb\") pod \"51151665-c2e2-48f9-8f33-179a6e8c98fe\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") "
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.791919 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-openstack-edpm-ipam\") pod \"51151665-c2e2-48f9-8f33-179a6e8c98fe\" (UID: \"51151665-c2e2-48f9-8f33-179a6e8c98fe\") "
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.792078 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "51151665-c2e2-48f9-8f33-179a6e8c98fe" (UID: "51151665-c2e2-48f9-8f33-179a6e8c98fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.792456 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "51151665-c2e2-48f9-8f33-179a6e8c98fe" (UID: "51151665-c2e2-48f9-8f33-179a6e8c98fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.792537 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.792743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51151665-c2e2-48f9-8f33-179a6e8c98fe" (UID: "51151665-c2e2-48f9-8f33-179a6e8c98fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.792831 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-config" (OuterVolumeSpecName: "config") pod "51151665-c2e2-48f9-8f33-179a6e8c98fe" (UID: "51151665-c2e2-48f9-8f33-179a6e8c98fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.793038 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "51151665-c2e2-48f9-8f33-179a6e8c98fe" (UID: "51151665-c2e2-48f9-8f33-179a6e8c98fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.793152 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "51151665-c2e2-48f9-8f33-179a6e8c98fe" (UID: "51151665-c2e2-48f9-8f33-179a6e8c98fe"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.800087 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51151665-c2e2-48f9-8f33-179a6e8c98fe-kube-api-access-x6gs7" (OuterVolumeSpecName: "kube-api-access-x6gs7") pod "51151665-c2e2-48f9-8f33-179a6e8c98fe" (UID: "51151665-c2e2-48f9-8f33-179a6e8c98fe"). InnerVolumeSpecName "kube-api-access-x6gs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.894891 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-config\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.895179 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.895189 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.895199 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.895207 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51151665-c2e2-48f9-8f33-179a6e8c98fe-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:14 crc kubenswrapper[4962]: I1201 21:59:14.895215 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6gs7\" (UniqueName: \"kubernetes.io/projected/51151665-c2e2-48f9-8f33-179a6e8c98fe-kube-api-access-x6gs7\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:15 crc kubenswrapper[4962]: I1201 21:59:15.702329 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-6n7cg"
Dec 01 21:59:15 crc kubenswrapper[4962]: I1201 21:59:15.769742 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-6n7cg"]
Dec 01 21:59:15 crc kubenswrapper[4962]: I1201 21:59:15.779945 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-6n7cg"]
Dec 01 21:59:16 crc kubenswrapper[4962]: I1201 21:59:16.240829 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51151665-c2e2-48f9-8f33-179a6e8c98fe" path="/var/lib/kubelet/pods/51151665-c2e2-48f9-8f33-179a6e8c98fe/volumes"
Dec 01 21:59:19 crc kubenswrapper[4962]: E1201 21:59:19.537724 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested"
Dec 01 21:59:19 crc kubenswrapper[4962]: E1201 21:59:19.538220 4962 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested"
Dec 01 21:59:19 crc kubenswrapper[4962]: E1201 21:59:19.538349 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9d4h9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-h98rk_openstack(ad0d848c-97d4-4360-a1e6-335cd2a8896c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 21:59:19 crc kubenswrapper[4962]: E1201 21:59:19.539736 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-h98rk" podUID="ad0d848c-97d4-4360-a1e6-335cd2a8896c"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.681614 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.688989 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.696652 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.787921 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.787945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e9a059a-712b-4ff4-b50e-7d94a96a9db5","Type":"ContainerDied","Data":"3f2d9cbde4485b49740dce5ef9a9aeebcd4abfce6e5fe5955bf2d2e23669f7e7"}
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.787992 4962 scope.go:117] "RemoveContainer" containerID="e05d8b8b65733b514c93844819537d91355a50c1bbe84d5b1f3c2f1e6383e213"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.803265 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9ef8bb6-0fc4-411e-82a1-85d95ced5818","Type":"ContainerDied","Data":"78e5aa08bb5fc48c8005a571ce973bcf7fe5ed46a0b3db0a3f5790ad1c304c45"}
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.803382 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.817414 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
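
The ErrImagePull above (and the ImagePullBackOff that follows shortly after) for heat-db-sync-h98rk is also surfaced in the pod's container statuses. A hedged client-go sketch that reads those waiting reasons; only the namespace, pod name, and image name come from the log, the rest is illustrative:

    // Hedged sketch, not part of the log: print waiting reasons such as
    // ErrImagePull / ImagePullBackOff for the failing heat-db-sync pod.
    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	pod, err := cs.CoreV1().Pods("openstack").Get(context.TODO(), "heat-db-sync-h98rk", metav1.GetOptions{})
    	if err != nil {
    		panic(err)
    	}
    	for _, st := range pod.Status.ContainerStatuses {
    		if w := st.State.Waiting; w != nil {
    			// While the heat-engine image pull keeps failing, Reason
    			// should alternate between ErrImagePull and ImagePullBackOff.
    			fmt.Printf("%s: %s (%s)\n", st.Name, w.Reason, w.Message)
    		}
    	}
    }
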
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.817401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd27d7a4-d5ee-48ee-b317-ac637797097e","Type":"ContainerDied","Data":"5ce9f6740dd2b413a3f05a7e48a594cdd8e9f9ae1631938f348d14e2d0402332"}
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.831220 4962 scope.go:117] "RemoveContainer" containerID="2e612e8c7d52bd7bb195592643b0167d6f4ce348b0ef115b6d213703e68c13cb"
Dec 01 21:59:19 crc kubenswrapper[4962]: E1201 21:59:19.831263 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-h98rk" podUID="ad0d848c-97d4-4360-a1e6-335cd2a8896c"
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870293 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-pod-info\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870414 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-config-data\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870441 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-config-data\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870482 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkmq\" (UniqueName: \"kubernetes.io/projected/bd27d7a4-d5ee-48ee-b317-ac637797097e-kube-api-access-cvkmq\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870505 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-confd\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-server-conf\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870579 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-erlang-cookie\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870599 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-confd\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870617 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-erlang-cookie-secret\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870636 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-sg-core-conf-yaml\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870654 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdxb\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-kube-api-access-jbdxb\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870671 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-ceilometer-tls-certs\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870693 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-log-httpd\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-plugins-conf\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870743 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-combined-ca-bundle\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870781 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-pod-info\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-run-httpd\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870847 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-plugins-conf\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870865 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-scripts\") pod \"bd27d7a4-d5ee-48ee-b317-ac637797097e\" (UID: \"bd27d7a4-d5ee-48ee-b317-ac637797097e\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870887 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-tls\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870909 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-server-conf\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-plugins\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.870990 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-plugins\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.871017 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-config-data\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.871076 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.871092 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvkbl\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-kube-api-access-wvkbl\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.871115 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-tls\") pod \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\" (UID: \"c9ef8bb6-0fc4-411e-82a1-85d95ced5818\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.871170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-erlang-cookie-secret\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.871187 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-erlang-cookie\") pod \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\" (UID: \"1e9a059a-712b-4ff4-b50e-7d94a96a9db5\") "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.875342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.875367 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.883684 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.884062 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.884520 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.897633 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.914644 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.915542 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.916472 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-pod-info" (OuterVolumeSpecName: "pod-info") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.934176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.934248 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd27d7a4-d5ee-48ee-b317-ac637797097e-kube-api-access-cvkmq" (OuterVolumeSpecName: "kube-api-access-cvkmq") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "kube-api-access-cvkmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.934313 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.934376 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-kube-api-access-wvkbl" (OuterVolumeSpecName: "kube-api-access-wvkbl") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "kube-api-access-wvkbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.934438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-scripts" (OuterVolumeSpecName: "scripts") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.934793 4962 scope.go:117] "RemoveContainer" containerID="461b6d48eac2ff8ad256d35fbf30e75bf1e1a0278acfdd421ba2a7d95ae106de" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.936147 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-kube-api-access-jbdxb" (OuterVolumeSpecName: "kube-api-access-jbdxb") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "kube-api-access-jbdxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.954115 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.957034 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.960446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.962146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-pod-info" (OuterVolumeSpecName: "pod-info") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.968262 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983593 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983644 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvkmq\" (UniqueName: \"kubernetes.io/projected/bd27d7a4-d5ee-48ee-b317-ac637797097e-kube-api-access-cvkmq\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983657 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983665 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983674 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbdxb\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-kube-api-access-jbdxb\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983682 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983689 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983697 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983705 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd27d7a4-d5ee-48ee-b317-ac637797097e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983713 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983721 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983729 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983737 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:19 crc 
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983745 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983756 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvkbl\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-kube-api-access-wvkbl\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983769 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983777 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983786 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983795 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:19 crc kubenswrapper[4962]: I1201 21:59:19.983804 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-pod-info\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.056275 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.060136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.074600 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-server-conf" (OuterVolumeSpecName: "server-conf") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.086094 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.086122 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-server-conf\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.086133 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.099816 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.111959 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-grqhx"]
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.116771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.133140 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-config-data" (OuterVolumeSpecName: "config-data") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.143861 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-config-data" (OuterVolumeSpecName: "config-data") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.155303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.159927 4962 scope.go:117] "RemoveContainer" containerID="8adf4ba0aa720627144c4b6055ae0379a2e8bc72a0049b5aea7634192f4d4038"
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.187953 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.188149 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.188242 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.188434 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.188523 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.212209 4962 scope.go:117] "RemoveContainer" containerID="f5cce121c6e120ee26efaae94bbeffca5976651918c066d3b6751059b27e89bf"
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.216042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-server-conf" (OuterVolumeSpecName: "server-conf") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.237757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-config-data" (OuterVolumeSpecName: "config-data") pod "bd27d7a4-d5ee-48ee-b317-ac637797097e" (UID: "bd27d7a4-d5ee-48ee-b317-ac637797097e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.242416 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1e9a059a-712b-4ff4-b50e-7d94a96a9db5" (UID: "1e9a059a-712b-4ff4-b50e-7d94a96a9db5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.257273 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c9ef8bb6-0fc4-411e-82a1-85d95ced5818" (UID: "c9ef8bb6-0fc4-411e-82a1-85d95ced5818"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.261623 4962 scope.go:117] "RemoveContainer" containerID="0fdbb7fc9151c4bfe4a3f9b7082315248516c2824a50d7b174c570405de73a0b" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.291180 4962 scope.go:117] "RemoveContainer" containerID="1b8680a116eb1730e294205b6fe4d9726f1a73c2af331267256f85134b3e5b63" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.293780 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd27d7a4-d5ee-48ee-b317-ac637797097e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.293813 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.293824 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e9a059a-712b-4ff4-b50e-7d94a96a9db5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.293832 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9ef8bb6-0fc4-411e-82a1-85d95ced5818-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.341286 4962 scope.go:117] "RemoveContainer" containerID="8873855e8cc86b1a5f5e32c301660def4b38d97deda148e127ed139adbd358a5" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.442171 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.487769 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.511416 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.514867 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="proxy-httpd" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.515071 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="proxy-httpd" Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.515148 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-notification-agent" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.515203 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-notification-agent" Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.515272 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="sg-core" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.515321 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="sg-core" Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.515416 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="setup-container" Dec 01 21:59:20 crc 
kubenswrapper[4962]: I1201 21:59:20.515488 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="setup-container" Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.515565 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="setup-container" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.515622 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="setup-container" Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.515702 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="rabbitmq" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.515767 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="rabbitmq" Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.515857 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="rabbitmq" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.515958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="rabbitmq" Dec 01 21:59:20 crc kubenswrapper[4962]: E1201 21:59:20.516097 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-central-agent" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.516171 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-central-agent" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.516562 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-notification-agent" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.516649 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" containerName="rabbitmq" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.516725 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" containerName="rabbitmq" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.516789 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="sg-core" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.516870 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="ceilometer-central-agent" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.516947 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" containerName="proxy-httpd" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.522106 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.526965 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.527498 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.527634 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.527732 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hqps9" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.527160 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.527452 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.528085 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.574254 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.586605 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.601491 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z2kr\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-kube-api-access-6z2kr\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.601571 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.601593 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.601620 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.601650 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.602023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.602063 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.602118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.602151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.602207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.602318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.607403 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.625930 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.636927 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.639813 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.641482 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.641633 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.642043 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.649121 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.662664 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.673346 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.675277 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.677137 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.677723 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.677995 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.678061 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.678232 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jkck4" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.678345 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.678536 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.690194 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704365 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704445 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704467 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-scripts\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704509 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-config-data\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704594 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dc4f027-5299-427c-9726-65012507b49b-log-httpd\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704640 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704662 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9785l\" (UniqueName: \"kubernetes.io/projected/6dc4f027-5299-427c-9726-65012507b49b-kube-api-access-9785l\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704689 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z2kr\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-kube-api-access-6z2kr\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704764 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704778 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dc4f027-5299-427c-9726-65012507b49b-run-httpd\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704795 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704824 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.704858 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.706739 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.707372 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.707770 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.707915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.708391 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.709394 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.708751 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.712250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.712460 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.715609 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.723668 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z2kr\" (UniqueName: \"kubernetes.io/projected/42940ca4-6f73-42b9-97b9-8fcf3fa4f968-kube-api-access-6z2kr\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.758261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42940ca4-6f73-42b9-97b9-8fcf3fa4f968\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 
21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807633 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807692 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-scripts\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807717 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-config-data\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dc4f027-5299-427c-9726-65012507b49b-log-httpd\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9785l\" (UniqueName: \"kubernetes.io/projected/6dc4f027-5299-427c-9726-65012507b49b-kube-api-access-9785l\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807836 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807868 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2284f352-fb8b-4432-b26f-106c1255dd90-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807912 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dc4f027-5299-427c-9726-65012507b49b-run-httpd\") pod 
\"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807973 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.807996 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-config-data\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808061 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2284f352-fb8b-4432-b26f-106c1255dd90-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808082 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808116 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56l2v\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-kube-api-access-56l2v\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808187 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc 
kubenswrapper[4962]: I1201 21:59:20.808344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dc4f027-5299-427c-9726-65012507b49b-log-httpd\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.808595 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dc4f027-5299-427c-9726-65012507b49b-run-httpd\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.810593 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.811962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-config-data\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.814689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-scripts\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.815646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.817599 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc4f027-5299-427c-9726-65012507b49b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.824548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9785l\" (UniqueName: \"kubernetes.io/projected/6dc4f027-5299-427c-9726-65012507b49b-kube-api-access-9785l\") pod \"ceilometer-0\" (UID: \"6dc4f027-5299-427c-9726-65012507b49b\") " pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.833574 4962 generic.go:334] "Generic (PLEG): container finished" podID="c626973f-0e99-4e4b-bc2b-8caddbada7aa" containerID="bce39f9eb20e081cea66065371d4d11acb22f50cd1c67b49ae8d588cdfe16dae" exitCode=0 Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.833644 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" event={"ID":"c626973f-0e99-4e4b-bc2b-8caddbada7aa","Type":"ContainerDied","Data":"bce39f9eb20e081cea66065371d4d11acb22f50cd1c67b49ae8d588cdfe16dae"} Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.833667 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" 
event={"ID":"c626973f-0e99-4e4b-bc2b-8caddbada7aa","Type":"ContainerStarted","Data":"4230f80c3618564eb548362e0ba6aba8a0c2595bf01a141037cd29d1cac8f348"} Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.909890 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2284f352-fb8b-4432-b26f-106c1255dd90-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.909976 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910028 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910094 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-config-data\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2284f352-fb8b-4432-b26f-106c1255dd90-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910132 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910175 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56l2v\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-kube-api-access-56l2v\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910206 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.910880 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.912545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-config-data\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.913400 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.914091 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.914344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2284f352-fb8b-4432-b26f-106c1255dd90-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.915434 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.917452 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2284f352-fb8b-4432-b26f-106c1255dd90-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.918782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 
crc kubenswrapper[4962]: I1201 21:59:20.919131 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2284f352-fb8b-4432-b26f-106c1255dd90-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.925747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.930482 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.931853 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56l2v\" (UniqueName: \"kubernetes.io/projected/2284f352-fb8b-4432-b26f-106c1255dd90-kube-api-access-56l2v\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.961097 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.968210 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2284f352-fb8b-4432-b26f-106c1255dd90\") " pod="openstack/rabbitmq-server-0" Dec 01 21:59:20 crc kubenswrapper[4962]: I1201 21:59:20.995872 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.425587 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.577113 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.613538 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.864059 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2284f352-fb8b-4432-b26f-106c1255dd90","Type":"ContainerStarted","Data":"082aeb3f3c90c9387d710afeaa7450b237b019ec1176c03398d204c622dc5bf2"} Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.866541 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" event={"ID":"c626973f-0e99-4e4b-bc2b-8caddbada7aa","Type":"ContainerStarted","Data":"9e96e2b438161cb6e5e19efd581c2861a8216eb8da4032553346d95c388ba843"} Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.868248 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.879116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42940ca4-6f73-42b9-97b9-8fcf3fa4f968","Type":"ContainerStarted","Data":"509bb58dc21fd60ad7619fe8d1392c310cf6efc30be55fca09dfa542fbc322bd"} Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.898350 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dc4f027-5299-427c-9726-65012507b49b","Type":"ContainerStarted","Data":"bb509525a4fe1c8f474cf420442fe22e505acf2d7f520e3cf1af5aa526b90358"} Dec 01 21:59:21 crc kubenswrapper[4962]: I1201 21:59:21.910778 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" podStartSLOduration=7.910765819 podStartE2EDuration="7.910765819s" podCreationTimestamp="2025-12-01 21:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:59:21.908348491 +0000 UTC m=+1546.009787686" watchObservedRunningTime="2025-12-01 21:59:21.910765819 +0000 UTC m=+1546.012205014" Dec 01 21:59:22 crc kubenswrapper[4962]: I1201 21:59:22.232047 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9a059a-712b-4ff4-b50e-7d94a96a9db5" path="/var/lib/kubelet/pods/1e9a059a-712b-4ff4-b50e-7d94a96a9db5/volumes" Dec 01 21:59:22 crc kubenswrapper[4962]: I1201 21:59:22.233184 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd27d7a4-d5ee-48ee-b317-ac637797097e" path="/var/lib/kubelet/pods/bd27d7a4-d5ee-48ee-b317-ac637797097e/volumes" Dec 01 21:59:22 crc kubenswrapper[4962]: I1201 21:59:22.234665 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ef8bb6-0fc4-411e-82a1-85d95ced5818" path="/var/lib/kubelet/pods/c9ef8bb6-0fc4-411e-82a1-85d95ced5818/volumes" Dec 01 21:59:23 crc kubenswrapper[4962]: I1201 21:59:23.944634 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42940ca4-6f73-42b9-97b9-8fcf3fa4f968","Type":"ContainerStarted","Data":"d1756adf50fb08cb7e78346a339e512220c725da98f4d206ff5fe19b86c54499"} 
Dec 01 21:59:23 crc kubenswrapper[4962]: I1201 21:59:23.951704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2284f352-fb8b-4432-b26f-106c1255dd90","Type":"ContainerStarted","Data":"2913e9bf91b13c4a8a6816e294e5189ee1d2c60c4fb7df487009deca8b3ed76f"} Dec 01 21:59:29 crc kubenswrapper[4962]: I1201 21:59:29.745130 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-grqhx" Dec 01 21:59:29 crc kubenswrapper[4962]: I1201 21:59:29.814683 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-bg7lh"] Dec 01 21:59:29 crc kubenswrapper[4962]: I1201 21:59:29.814921 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" podUID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerName="dnsmasq-dns" containerID="cri-o://c77afe8853921b13088819e7c2221c0ebb8655e3725f3bb62bbc7964f0f8c9f1" gracePeriod=10 Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.030196 4962 generic.go:334] "Generic (PLEG): container finished" podID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerID="c77afe8853921b13088819e7c2221c0ebb8655e3725f3bb62bbc7964f0f8c9f1" exitCode=0 Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.030283 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" event={"ID":"a0db8b97-4fe1-4ccc-bdd6-e4635285a854","Type":"ContainerDied","Data":"c77afe8853921b13088819e7c2221c0ebb8655e3725f3bb62bbc7964f0f8c9f1"} Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.033093 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dc4f027-5299-427c-9726-65012507b49b","Type":"ContainerStarted","Data":"f4c624f0a4bedabf73c3f05c533cdbb691927c00547a6ecd7508352217306e17"} Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.500746 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.603587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-swift-storage-0\") pod \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.603647 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-svc\") pod \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.603680 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-sb\") pod \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.603903 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-nb\") pod \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.603969 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-config\") pod \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.603989 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbt9x\" (UniqueName: \"kubernetes.io/projected/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-kube-api-access-kbt9x\") pod \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\" (UID: \"a0db8b97-4fe1-4ccc-bdd6-e4635285a854\") " Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.611050 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-kube-api-access-kbt9x" (OuterVolumeSpecName: "kube-api-access-kbt9x") pod "a0db8b97-4fe1-4ccc-bdd6-e4635285a854" (UID: "a0db8b97-4fe1-4ccc-bdd6-e4635285a854"). InnerVolumeSpecName "kube-api-access-kbt9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.741366 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbt9x\" (UniqueName: \"kubernetes.io/projected/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-kube-api-access-kbt9x\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.748021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0db8b97-4fe1-4ccc-bdd6-e4635285a854" (UID: "a0db8b97-4fe1-4ccc-bdd6-e4635285a854"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.760666 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0db8b97-4fe1-4ccc-bdd6-e4635285a854" (UID: "a0db8b97-4fe1-4ccc-bdd6-e4635285a854"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.777517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0db8b97-4fe1-4ccc-bdd6-e4635285a854" (UID: "a0db8b97-4fe1-4ccc-bdd6-e4635285a854"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.785485 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-config" (OuterVolumeSpecName: "config") pod "a0db8b97-4fe1-4ccc-bdd6-e4635285a854" (UID: "a0db8b97-4fe1-4ccc-bdd6-e4635285a854"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.795667 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a0db8b97-4fe1-4ccc-bdd6-e4635285a854" (UID: "a0db8b97-4fe1-4ccc-bdd6-e4635285a854"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.844090 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.844335 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.844394 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.844447 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:30 crc kubenswrapper[4962]: I1201 21:59:30.844511 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db8b97-4fe1-4ccc-bdd6-e4635285a854-config\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:31 crc kubenswrapper[4962]: I1201 21:59:31.043820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dc4f027-5299-427c-9726-65012507b49b","Type":"ContainerStarted","Data":"67e6cba5f9c83c97d830e21bc34b7b832d7bb447966d09fdca3275918dd953a5"} Dec 01 21:59:31 crc kubenswrapper[4962]: I1201 21:59:31.045798 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" event={"ID":"a0db8b97-4fe1-4ccc-bdd6-e4635285a854","Type":"ContainerDied","Data":"d5884c14e435507d16f47befcfd85a870b49a5761eca34309f7184917994d3b2"} Dec 01 21:59:31 crc kubenswrapper[4962]: I1201 21:59:31.045831 4962 scope.go:117] "RemoveContainer" containerID="c77afe8853921b13088819e7c2221c0ebb8655e3725f3bb62bbc7964f0f8c9f1" Dec 01 21:59:31 crc kubenswrapper[4962]: I1201 21:59:31.045983 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-bg7lh" Dec 01 21:59:31 crc kubenswrapper[4962]: I1201 21:59:31.090777 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-bg7lh"] Dec 01 21:59:31 crc kubenswrapper[4962]: I1201 21:59:31.100376 4962 scope.go:117] "RemoveContainer" containerID="d1c5ba058c84dbadc3b7e623c9a161eaa74208431d249e3b27acaecbaa770789" Dec 01 21:59:31 crc kubenswrapper[4962]: I1201 21:59:31.106598 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-bg7lh"] Dec 01 21:59:32 crc kubenswrapper[4962]: I1201 21:59:32.231325 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" path="/var/lib/kubelet/pods/a0db8b97-4fe1-4ccc-bdd6-e4635285a854/volumes" Dec 01 21:59:33 crc kubenswrapper[4962]: I1201 21:59:33.077728 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dc4f027-5299-427c-9726-65012507b49b","Type":"ContainerStarted","Data":"d71867293fe6bf6fc075b1745895619bf22a8f813149f8a2a347f4eb5ac31d9a"} Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.102767 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dc4f027-5299-427c-9726-65012507b49b","Type":"ContainerStarted","Data":"342b1c0aca1a93249347806bdc2d1ed62796c6e778a75dfbc8ed28193012bf37"} Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.103257 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.140639 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.722262294 podStartE2EDuration="15.140620336s" podCreationTimestamp="2025-12-01 21:59:20 +0000 UTC" firstStartedPulling="2025-12-01 21:59:21.600386163 +0000 UTC m=+1545.701825358" lastFinishedPulling="2025-12-01 21:59:34.018744205 +0000 UTC m=+1558.120183400" observedRunningTime="2025-12-01 21:59:35.124105718 +0000 UTC m=+1559.225544913" watchObservedRunningTime="2025-12-01 21:59:35.140620336 +0000 UTC m=+1559.242059521" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.733411 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k"] Dec 01 21:59:35 crc kubenswrapper[4962]: E1201 21:59:35.734108 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerName="init" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.734125 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerName="init" Dec 01 21:59:35 crc kubenswrapper[4962]: E1201 21:59:35.734151 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerName="dnsmasq-dns" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.734158 4962 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerName="dnsmasq-dns" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.734395 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0db8b97-4fe1-4ccc-bdd6-e4635285a854" containerName="dnsmasq-dns" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.735152 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.737535 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.737632 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.737828 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.738063 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.756472 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.756538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt66w\" (UniqueName: \"kubernetes.io/projected/b8f0680f-6407-4e35-a927-3c0613e4f3e5-kube-api-access-wt66w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.756682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.756734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.804334 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k"] Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.859073 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt66w\" (UniqueName: \"kubernetes.io/projected/b8f0680f-6407-4e35-a927-3c0613e4f3e5-kube-api-access-wt66w\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.859299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.859373 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.859423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.865320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.865506 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.866534 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:35 crc kubenswrapper[4962]: I1201 21:59:35.884740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt66w\" (UniqueName: \"kubernetes.io/projected/b8f0680f-6407-4e35-a927-3c0613e4f3e5-kube-api-access-wt66w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:36 crc kubenswrapper[4962]: I1201 21:59:36.058419 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 21:59:36 crc kubenswrapper[4962]: I1201 21:59:36.756691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k"] Dec 01 21:59:36 crc kubenswrapper[4962]: W1201 21:59:36.762363 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f0680f_6407_4e35_a927_3c0613e4f3e5.slice/crio-84fcf34b1097e01ff9cf605ff2ac16a209ee9a68ec2de12b32f302b015d54c2a WatchSource:0}: Error finding container 84fcf34b1097e01ff9cf605ff2ac16a209ee9a68ec2de12b32f302b015d54c2a: Status 404 returned error can't find the container with id 84fcf34b1097e01ff9cf605ff2ac16a209ee9a68ec2de12b32f302b015d54c2a Dec 01 21:59:37 crc kubenswrapper[4962]: I1201 21:59:37.128064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" event={"ID":"b8f0680f-6407-4e35-a927-3c0613e4f3e5","Type":"ContainerStarted","Data":"84fcf34b1097e01ff9cf605ff2ac16a209ee9a68ec2de12b32f302b015d54c2a"} Dec 01 21:59:37 crc kubenswrapper[4962]: I1201 21:59:37.130742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h98rk" event={"ID":"ad0d848c-97d4-4360-a1e6-335cd2a8896c","Type":"ContainerStarted","Data":"fc8aac9eea6d702113a0e121c057dc321eeeb7340b6c56d21962f43c2f8b119a"} Dec 01 21:59:37 crc kubenswrapper[4962]: I1201 21:59:37.163385 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-h98rk" podStartSLOduration=2.77351341 podStartE2EDuration="41.163365806s" podCreationTimestamp="2025-12-01 21:58:56 +0000 UTC" firstStartedPulling="2025-12-01 21:58:57.517196802 +0000 UTC m=+1521.618636007" lastFinishedPulling="2025-12-01 21:59:35.907049208 +0000 UTC m=+1560.008488403" observedRunningTime="2025-12-01 21:59:37.149598016 +0000 UTC m=+1561.251037231" watchObservedRunningTime="2025-12-01 21:59:37.163365806 +0000 UTC m=+1561.264805011" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.055868 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cgzd8"] Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.058973 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.079177 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgzd8"] Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.219369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjvr\" (UniqueName: \"kubernetes.io/projected/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-kube-api-access-jqjvr\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.219429 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-catalog-content\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.219481 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-utilities\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.322122 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjvr\" (UniqueName: \"kubernetes.io/projected/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-kube-api-access-jqjvr\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.322211 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-catalog-content\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.322289 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-utilities\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.322689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-catalog-content\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.323469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-utilities\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.367698 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jqjvr\" (UniqueName: \"kubernetes.io/projected/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-kube-api-access-jqjvr\") pod \"certified-operators-cgzd8\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.390055 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:38 crc kubenswrapper[4962]: I1201 21:59:38.895680 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgzd8"] Dec 01 21:59:38 crc kubenswrapper[4962]: W1201 21:59:38.907237 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58eb5862_e8e9_4558_8eea_fe5bbffad3e1.slice/crio-1e02c3b4a66276a9d510770fa62364d1a9f2faf610843a8bf38323eaf4ab1108 WatchSource:0}: Error finding container 1e02c3b4a66276a9d510770fa62364d1a9f2faf610843a8bf38323eaf4ab1108: Status 404 returned error can't find the container with id 1e02c3b4a66276a9d510770fa62364d1a9f2faf610843a8bf38323eaf4ab1108 Dec 01 21:59:39 crc kubenswrapper[4962]: I1201 21:59:39.156838 4962 generic.go:334] "Generic (PLEG): container finished" podID="ad0d848c-97d4-4360-a1e6-335cd2a8896c" containerID="fc8aac9eea6d702113a0e121c057dc321eeeb7340b6c56d21962f43c2f8b119a" exitCode=0 Dec 01 21:59:39 crc kubenswrapper[4962]: I1201 21:59:39.156954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h98rk" event={"ID":"ad0d848c-97d4-4360-a1e6-335cd2a8896c","Type":"ContainerDied","Data":"fc8aac9eea6d702113a0e121c057dc321eeeb7340b6c56d21962f43c2f8b119a"} Dec 01 21:59:39 crc kubenswrapper[4962]: I1201 21:59:39.158466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgzd8" event={"ID":"58eb5862-e8e9-4558-8eea-fe5bbffad3e1","Type":"ContainerStarted","Data":"1e02c3b4a66276a9d510770fa62364d1a9f2faf610843a8bf38323eaf4ab1108"} Dec 01 21:59:40 crc kubenswrapper[4962]: I1201 21:59:40.856266 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-h98rk" Dec 01 21:59:40 crc kubenswrapper[4962]: I1201 21:59:40.992262 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-combined-ca-bundle\") pod \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " Dec 01 21:59:40 crc kubenswrapper[4962]: I1201 21:59:40.992412 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d4h9\" (UniqueName: \"kubernetes.io/projected/ad0d848c-97d4-4360-a1e6-335cd2a8896c-kube-api-access-9d4h9\") pod \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " Dec 01 21:59:40 crc kubenswrapper[4962]: I1201 21:59:40.992555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-config-data\") pod \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\" (UID: \"ad0d848c-97d4-4360-a1e6-335cd2a8896c\") " Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.003314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0d848c-97d4-4360-a1e6-335cd2a8896c-kube-api-access-9d4h9" (OuterVolumeSpecName: "kube-api-access-9d4h9") pod "ad0d848c-97d4-4360-a1e6-335cd2a8896c" (UID: "ad0d848c-97d4-4360-a1e6-335cd2a8896c"). InnerVolumeSpecName "kube-api-access-9d4h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.049134 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad0d848c-97d4-4360-a1e6-335cd2a8896c" (UID: "ad0d848c-97d4-4360-a1e6-335cd2a8896c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.096731 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.096769 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d4h9\" (UniqueName: \"kubernetes.io/projected/ad0d848c-97d4-4360-a1e6-335cd2a8896c-kube-api-access-9d4h9\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.109145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-config-data" (OuterVolumeSpecName: "config-data") pod "ad0d848c-97d4-4360-a1e6-335cd2a8896c" (UID: "ad0d848c-97d4-4360-a1e6-335cd2a8896c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.187164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h98rk" event={"ID":"ad0d848c-97d4-4360-a1e6-335cd2a8896c","Type":"ContainerDied","Data":"59c21c945b2337826c79dade86fcbf0cca0c838d91c1dae9f9d8fbed329ede84"} Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.187468 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59c21c945b2337826c79dade86fcbf0cca0c838d91c1dae9f9d8fbed329ede84" Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.187206 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h98rk" Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.189305 4962 generic.go:334] "Generic (PLEG): container finished" podID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerID="e6e6fdb55822c15a2f289d252610946e9b25a529cb8a1ab68485fc984a95b703" exitCode=0 Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.189355 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgzd8" event={"ID":"58eb5862-e8e9-4558-8eea-fe5bbffad3e1","Type":"ContainerDied","Data":"e6e6fdb55822c15a2f289d252610946e9b25a529cb8a1ab68485fc984a95b703"} Dec 01 21:59:41 crc kubenswrapper[4962]: I1201 21:59:41.200797 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0d848c-97d4-4360-a1e6-335cd2a8896c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.245322 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-65cfb46b8d-72cj6"] Dec 01 21:59:42 crc kubenswrapper[4962]: E1201 21:59:42.245693 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0d848c-97d4-4360-a1e6-335cd2a8896c" containerName="heat-db-sync" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.245704 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0d848c-97d4-4360-a1e6-335cd2a8896c" containerName="heat-db-sync" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.246421 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0d848c-97d4-4360-a1e6-335cd2a8896c" containerName="heat-db-sync" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.247243 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.283495 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-65cfb46b8d-72cj6"] Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.285895 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6d7cf4b459-tkf5n"] Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.287335 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.327836 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-config-data\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.328132 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2phv\" (UniqueName: \"kubernetes.io/projected/59a79ba3-6726-4020-8e97-80654b9cc661-kube-api-access-l2phv\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.328198 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-config-data-custom\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.328280 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-combined-ca-bundle\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.395152 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6d7cf4b459-tkf5n"] Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.408651 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-77477565cc-t4xcz"] Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.410602 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-config-data-custom\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430177 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-public-tls-certs\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-combined-ca-bundle\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430334 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9knx8\" (UniqueName: \"kubernetes.io/projected/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-kube-api-access-9knx8\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430365 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-config-data-custom\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430397 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-combined-ca-bundle\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430501 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-config-data\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430532 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2phv\" (UniqueName: \"kubernetes.io/projected/59a79ba3-6726-4020-8e97-80654b9cc661-kube-api-access-l2phv\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430561 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-internal-tls-certs\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.430609 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-config-data\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.441082 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-77477565cc-t4xcz"] Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.454213 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-config-data-custom\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.459835 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-combined-ca-bundle\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.466331 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a79ba3-6726-4020-8e97-80654b9cc661-config-data\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.477919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2phv\" (UniqueName: \"kubernetes.io/projected/59a79ba3-6726-4020-8e97-80654b9cc661-kube-api-access-l2phv\") pod \"heat-engine-65cfb46b8d-72cj6\" (UID: \"59a79ba3-6726-4020-8e97-80654b9cc661\") " pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.532319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-public-tls-certs\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.532637 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-internal-tls-certs\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.532731 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9knx8\" (UniqueName: \"kubernetes.io/projected/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-kube-api-access-9knx8\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: 
I1201 21:59:42.532809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-config-data-custom\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.532892 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-combined-ca-bundle\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.533033 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-public-tls-certs\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.533155 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nv2\" (UniqueName: \"kubernetes.io/projected/673eaa4d-d246-4ca5-8f8e-7b464149d355-kube-api-access-v8nv2\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.533260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-config-data-custom\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.533420 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-internal-tls-certs\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.533570 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-config-data\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.533674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-config-data\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.533793 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-combined-ca-bundle\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 
21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.537601 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-internal-tls-certs\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.538109 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-combined-ca-bundle\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.539158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-config-data-custom\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.539651 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-public-tls-certs\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.542574 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-config-data\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.558159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9knx8\" (UniqueName: \"kubernetes.io/projected/bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4-kube-api-access-9knx8\") pod \"heat-cfnapi-6d7cf4b459-tkf5n\" (UID: \"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4\") " pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.622907 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.631618 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.636405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-combined-ca-bundle\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.636595 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-internal-tls-certs\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.636713 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-public-tls-certs\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.636798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8nv2\" (UniqueName: \"kubernetes.io/projected/673eaa4d-d246-4ca5-8f8e-7b464149d355-kube-api-access-v8nv2\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.636889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-config-data-custom\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.637539 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-config-data\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.640473 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-public-tls-certs\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.642602 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-combined-ca-bundle\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.643569 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-config-data-custom\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc 
kubenswrapper[4962]: I1201 21:59:42.645381 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-config-data\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.653210 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/673eaa4d-d246-4ca5-8f8e-7b464149d355-internal-tls-certs\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.659659 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8nv2\" (UniqueName: \"kubernetes.io/projected/673eaa4d-d246-4ca5-8f8e-7b464149d355-kube-api-access-v8nv2\") pod \"heat-api-77477565cc-t4xcz\" (UID: \"673eaa4d-d246-4ca5-8f8e-7b464149d355\") " pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:42 crc kubenswrapper[4962]: I1201 21:59:42.730876 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.580414 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8cwr7"] Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.585836 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.599766 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cwr7"] Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.694898 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-catalog-content\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.694984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-utilities\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.695008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9fr\" (UniqueName: \"kubernetes.io/projected/0cb296aa-497e-434e-b234-4f54ca5bb9bd-kube-api-access-8c9fr\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.796850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-catalog-content\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.796978 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-utilities\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.797005 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9fr\" (UniqueName: \"kubernetes.io/projected/0cb296aa-497e-434e-b234-4f54ca5bb9bd-kube-api-access-8c9fr\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.797423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-catalog-content\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.797554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-utilities\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.821480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9fr\" (UniqueName: \"kubernetes.io/projected/0cb296aa-497e-434e-b234-4f54ca5bb9bd-kube-api-access-8c9fr\") pod \"redhat-marketplace-8cwr7\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:44 crc kubenswrapper[4962]: I1201 21:59:44.915255 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 21:59:49 crc kubenswrapper[4962]: I1201 21:59:49.922406 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 21:59:50 crc kubenswrapper[4962]: I1201 21:59:50.327227 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgzd8" event={"ID":"58eb5862-e8e9-4558-8eea-fe5bbffad3e1","Type":"ContainerStarted","Data":"3e7a9f3d9ba6091d15be3c9fcea8003466f68996ceb8178ea2d787c1fb3e8039"} Dec 01 21:59:50 crc kubenswrapper[4962]: I1201 21:59:50.334641 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" event={"ID":"b8f0680f-6407-4e35-a927-3c0613e4f3e5","Type":"ContainerStarted","Data":"6c9ace807ba3e115f342edc739655675d7e1f0d00fb436493cbb801d2a753d40"} Dec 01 21:59:50 crc kubenswrapper[4962]: I1201 21:59:50.373880 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" podStartSLOduration=2.220419912 podStartE2EDuration="15.373862242s" podCreationTimestamp="2025-12-01 21:59:35 +0000 UTC" firstStartedPulling="2025-12-01 21:59:36.765295379 +0000 UTC m=+1560.866734584" lastFinishedPulling="2025-12-01 21:59:49.918737699 +0000 UTC m=+1574.020176914" observedRunningTime="2025-12-01 21:59:50.366149774 +0000 UTC m=+1574.467588969" watchObservedRunningTime="2025-12-01 21:59:50.373862242 +0000 UTC m=+1574.475301437" Dec 01 21:59:50 crc kubenswrapper[4962]: I1201 21:59:50.459696 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6d7cf4b459-tkf5n"] Dec 01 21:59:50 crc kubenswrapper[4962]: I1201 21:59:50.492527 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cwr7"] Dec 01 21:59:50 crc kubenswrapper[4962]: W1201 21:59:50.618357 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod673eaa4d_d246_4ca5_8f8e_7b464149d355.slice/crio-33b83ea3111382e9009301316d4cac27ee374d82958de01ef01ccf97b7e2a153 WatchSource:0}: Error finding container 33b83ea3111382e9009301316d4cac27ee374d82958de01ef01ccf97b7e2a153: Status 404 returned error can't find the container with id 33b83ea3111382e9009301316d4cac27ee374d82958de01ef01ccf97b7e2a153 Dec 01 21:59:50 crc kubenswrapper[4962]: I1201 21:59:50.633303 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-77477565cc-t4xcz"] Dec 01 21:59:50 crc kubenswrapper[4962]: I1201 21:59:50.672041 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-65cfb46b8d-72cj6"] Dec 01 21:59:51 crc kubenswrapper[4962]: I1201 21:59:51.189013 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 21:59:51 crc kubenswrapper[4962]: I1201 21:59:51.346640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-65cfb46b8d-72cj6" event={"ID":"59a79ba3-6726-4020-8e97-80654b9cc661","Type":"ContainerStarted","Data":"e8bb3accf26877917f0a0d439459a64b5edd51cd1b6eadad7210cd9432b0dc74"} Dec 01 21:59:51 crc kubenswrapper[4962]: I1201 21:59:51.348297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" 
event={"ID":"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4","Type":"ContainerStarted","Data":"1972834eac37db2477ed21a2f19f774b43b3aa070ad4b30d69e4cc550c5e4658"} Dec 01 21:59:51 crc kubenswrapper[4962]: I1201 21:59:51.349281 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77477565cc-t4xcz" event={"ID":"673eaa4d-d246-4ca5-8f8e-7b464149d355","Type":"ContainerStarted","Data":"33b83ea3111382e9009301316d4cac27ee374d82958de01ef01ccf97b7e2a153"} Dec 01 21:59:51 crc kubenswrapper[4962]: I1201 21:59:51.350648 4962 generic.go:334] "Generic (PLEG): container finished" podID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerID="14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a" exitCode=0 Dec 01 21:59:51 crc kubenswrapper[4962]: I1201 21:59:51.352180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cwr7" event={"ID":"0cb296aa-497e-434e-b234-4f54ca5bb9bd","Type":"ContainerDied","Data":"14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a"} Dec 01 21:59:51 crc kubenswrapper[4962]: I1201 21:59:51.352206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cwr7" event={"ID":"0cb296aa-497e-434e-b234-4f54ca5bb9bd","Type":"ContainerStarted","Data":"288ce108163db3f029496e4bcf3ec6e2df6b2294ff3d22edd359636b6caa785d"} Dec 01 21:59:51 crc kubenswrapper[4962]: E1201 21:59:51.965812 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58eb5862_e8e9_4558_8eea_fe5bbffad3e1.slice/crio-conmon-3e7a9f3d9ba6091d15be3c9fcea8003466f68996ceb8178ea2d787c1fb3e8039.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58eb5862_e8e9_4558_8eea_fe5bbffad3e1.slice/crio-3e7a9f3d9ba6091d15be3c9fcea8003466f68996ceb8178ea2d787c1fb3e8039.scope\": RecentStats: unable to find data in memory cache]" Dec 01 21:59:52 crc kubenswrapper[4962]: I1201 21:59:52.362612 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-65cfb46b8d-72cj6" event={"ID":"59a79ba3-6726-4020-8e97-80654b9cc661","Type":"ContainerStarted","Data":"8ad59a0c80f538b099a7c775f948f4295bf942ce21edc37b84509fa158cb336e"} Dec 01 21:59:52 crc kubenswrapper[4962]: I1201 21:59:52.362777 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 21:59:52 crc kubenswrapper[4962]: I1201 21:59:52.364738 4962 generic.go:334] "Generic (PLEG): container finished" podID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerID="3e7a9f3d9ba6091d15be3c9fcea8003466f68996ceb8178ea2d787c1fb3e8039" exitCode=0 Dec 01 21:59:52 crc kubenswrapper[4962]: I1201 21:59:52.364773 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgzd8" event={"ID":"58eb5862-e8e9-4558-8eea-fe5bbffad3e1","Type":"ContainerDied","Data":"3e7a9f3d9ba6091d15be3c9fcea8003466f68996ceb8178ea2d787c1fb3e8039"} Dec 01 21:59:52 crc kubenswrapper[4962]: I1201 21:59:52.403898 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-65cfb46b8d-72cj6" podStartSLOduration=10.403879378 podStartE2EDuration="10.403879378s" podCreationTimestamp="2025-12-01 21:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:59:52.379706043 +0000 
UTC m=+1576.481145248" watchObservedRunningTime="2025-12-01 21:59:52.403879378 +0000 UTC m=+1576.505318573" Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.442528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77477565cc-t4xcz" event={"ID":"673eaa4d-d246-4ca5-8f8e-7b464149d355","Type":"ContainerStarted","Data":"00fcbc492e1dae041a547f841528e9757b8c1e74ccfe152db46e23b6514d2276"} Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.443118 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.448081 4962 generic.go:334] "Generic (PLEG): container finished" podID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerID="3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827" exitCode=0 Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.448166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cwr7" event={"ID":"0cb296aa-497e-434e-b234-4f54ca5bb9bd","Type":"ContainerDied","Data":"3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827"} Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.456191 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgzd8" event={"ID":"58eb5862-e8e9-4558-8eea-fe5bbffad3e1","Type":"ContainerStarted","Data":"46a9430b4dbf0b39eaecf7f583426dd4b1e567e53b6f12e680a9fc1a6add876c"} Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.472819 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" event={"ID":"bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4","Type":"ContainerStarted","Data":"1fb761e78efe2cef49ded3de9bcc9547a42abc027ef3c860a782f46355b7ac14"} Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.473880 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.482475 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-77477565cc-t4xcz" podStartSLOduration=9.770515596 podStartE2EDuration="12.482455929s" podCreationTimestamp="2025-12-01 21:59:42 +0000 UTC" firstStartedPulling="2025-12-01 21:59:50.62122024 +0000 UTC m=+1574.722659435" lastFinishedPulling="2025-12-01 21:59:53.333160573 +0000 UTC m=+1577.434599768" observedRunningTime="2025-12-01 21:59:54.469364099 +0000 UTC m=+1578.570803294" watchObservedRunningTime="2025-12-01 21:59:54.482455929 +0000 UTC m=+1578.583895124" Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.530419 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cgzd8" podStartSLOduration=4.134995061 podStartE2EDuration="16.530390767s" podCreationTimestamp="2025-12-01 21:59:38 +0000 UTC" firstStartedPulling="2025-12-01 21:59:41.194316965 +0000 UTC m=+1565.295756160" lastFinishedPulling="2025-12-01 21:59:53.589712681 +0000 UTC m=+1577.691151866" observedRunningTime="2025-12-01 21:59:54.511847692 +0000 UTC m=+1578.613286887" watchObservedRunningTime="2025-12-01 21:59:54.530390767 +0000 UTC m=+1578.631829972" Dec 01 21:59:54 crc kubenswrapper[4962]: I1201 21:59:54.557098 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" podStartSLOduration=9.693942596 podStartE2EDuration="12.557076903s" podCreationTimestamp="2025-12-01 21:59:42 +0000 UTC" 
firstStartedPulling="2025-12-01 21:59:50.456111462 +0000 UTC m=+1574.557550657" lastFinishedPulling="2025-12-01 21:59:53.319245769 +0000 UTC m=+1577.420684964" observedRunningTime="2025-12-01 21:59:54.529269545 +0000 UTC m=+1578.630708740" watchObservedRunningTime="2025-12-01 21:59:54.557076903 +0000 UTC m=+1578.658516108" Dec 01 21:59:56 crc kubenswrapper[4962]: I1201 21:59:56.498384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cwr7" event={"ID":"0cb296aa-497e-434e-b234-4f54ca5bb9bd","Type":"ContainerStarted","Data":"3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7"} Dec 01 21:59:56 crc kubenswrapper[4962]: I1201 21:59:56.526300 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8cwr7" podStartSLOduration=8.68140157 podStartE2EDuration="12.526281057s" podCreationTimestamp="2025-12-01 21:59:44 +0000 UTC" firstStartedPulling="2025-12-01 21:59:51.354355377 +0000 UTC m=+1575.455794572" lastFinishedPulling="2025-12-01 21:59:55.199234874 +0000 UTC m=+1579.300674059" observedRunningTime="2025-12-01 21:59:56.520816282 +0000 UTC m=+1580.622255497" watchObservedRunningTime="2025-12-01 21:59:56.526281057 +0000 UTC m=+1580.627720252" Dec 01 21:59:57 crc kubenswrapper[4962]: I1201 21:59:57.512416 4962 generic.go:334] "Generic (PLEG): container finished" podID="42940ca4-6f73-42b9-97b9-8fcf3fa4f968" containerID="d1756adf50fb08cb7e78346a339e512220c725da98f4d206ff5fe19b86c54499" exitCode=0 Dec 01 21:59:57 crc kubenswrapper[4962]: I1201 21:59:57.512480 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42940ca4-6f73-42b9-97b9-8fcf3fa4f968","Type":"ContainerDied","Data":"d1756adf50fb08cb7e78346a339e512220c725da98f4d206ff5fe19b86c54499"} Dec 01 21:59:57 crc kubenswrapper[4962]: I1201 21:59:57.516043 4962 generic.go:334] "Generic (PLEG): container finished" podID="2284f352-fb8b-4432-b26f-106c1255dd90" containerID="2913e9bf91b13c4a8a6816e294e5189ee1d2c60c4fb7df487009deca8b3ed76f" exitCode=0 Dec 01 21:59:57 crc kubenswrapper[4962]: I1201 21:59:57.516158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2284f352-fb8b-4432-b26f-106c1255dd90","Type":"ContainerDied","Data":"2913e9bf91b13c4a8a6816e294e5189ee1d2c60c4fb7df487009deca8b3ed76f"} Dec 01 21:59:58 crc kubenswrapper[4962]: I1201 21:59:58.391092 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:58 crc kubenswrapper[4962]: I1201 21:59:58.392149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 21:59:58 crc kubenswrapper[4962]: I1201 21:59:58.528836 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2284f352-fb8b-4432-b26f-106c1255dd90","Type":"ContainerStarted","Data":"6a40123b6a1ae3ff222843aa53fae03d1f4df36e55c4b20c2217d75a104957c8"} Dec 01 21:59:58 crc kubenswrapper[4962]: I1201 21:59:58.529138 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 21:59:58 crc kubenswrapper[4962]: I1201 21:59:58.532812 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"42940ca4-6f73-42b9-97b9-8fcf3fa4f968","Type":"ContainerStarted","Data":"9f4e0a6ecf4ec8443cb98003b358bd5cd6b081cdf7c207875c85c9774a19a824"} Dec 01 21:59:58 crc kubenswrapper[4962]: I1201 21:59:58.533016 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 21:59:58 crc kubenswrapper[4962]: I1201 21:59:58.552523 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.552503806 podStartE2EDuration="38.552503806s" podCreationTimestamp="2025-12-01 21:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:59:58.548353688 +0000 UTC m=+1582.649792893" watchObservedRunningTime="2025-12-01 21:59:58.552503806 +0000 UTC m=+1582.653943001" Dec 01 21:59:59 crc kubenswrapper[4962]: I1201 21:59:59.451805 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cgzd8" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="registry-server" probeResult="failure" output=< Dec 01 21:59:59 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 21:59:59 crc kubenswrapper[4962]: > Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.142369 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.142343773 podStartE2EDuration="40.142343773s" podCreationTimestamp="2025-12-01 21:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:59:58.583545635 +0000 UTC m=+1582.684984860" watchObservedRunningTime="2025-12-01 22:00:00.142343773 +0000 UTC m=+1584.243782968" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.150834 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9"] Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.153254 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.156836 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.157571 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.193191 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9"] Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.306090 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-config-volume\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.306178 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-secret-volume\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.306274 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48cs\" (UniqueName: \"kubernetes.io/projected/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-kube-api-access-s48cs\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.408767 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-config-volume\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.408882 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-secret-volume\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.409788 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-config-volume\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.410106 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48cs\" (UniqueName: \"kubernetes.io/projected/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-kube-api-access-s48cs\") pod 
\"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.416369 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-secret-volume\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.430859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48cs\" (UniqueName: \"kubernetes.io/projected/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-kube-api-access-s48cs\") pod \"collect-profiles-29410440-fqjh9\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.471999 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:00 crc kubenswrapper[4962]: W1201 22:00:00.973939 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e2c40f_767a_4ec4_a768_f64f3d2b5b20.slice/crio-45060130431c0ef9dc0b78e232ca89079c6daf4b8c0ee883f37757b96ba74edf WatchSource:0}: Error finding container 45060130431c0ef9dc0b78e232ca89079c6daf4b8c0ee883f37757b96ba74edf: Status 404 returned error can't find the container with id 45060130431c0ef9dc0b78e232ca89079c6daf4b8c0ee883f37757b96ba74edf Dec 01 22:00:00 crc kubenswrapper[4962]: I1201 22:00:00.984397 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9"] Dec 01 22:00:01 crc kubenswrapper[4962]: I1201 22:00:01.531318 4962 scope.go:117] "RemoveContainer" containerID="ee46eef84872efd0f6091b889c036282a64fe41fca51819c77b23dfe90ad9487" Dec 01 22:00:01 crc kubenswrapper[4962]: I1201 22:00:01.582128 4962 generic.go:334] "Generic (PLEG): container finished" podID="e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" containerID="47f9cc9152e2e099a2a13a1dc683fecd637345e62e616527567081918534a58c" exitCode=0 Dec 01 22:00:01 crc kubenswrapper[4962]: I1201 22:00:01.582393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" event={"ID":"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20","Type":"ContainerDied","Data":"47f9cc9152e2e099a2a13a1dc683fecd637345e62e616527567081918534a58c"} Dec 01 22:00:01 crc kubenswrapper[4962]: I1201 22:00:01.582527 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" event={"ID":"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20","Type":"ContainerStarted","Data":"45060130431c0ef9dc0b78e232ca89079c6daf4b8c0ee883f37757b96ba74edf"} Dec 01 22:00:02 crc kubenswrapper[4962]: I1201 22:00:02.598687 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8f0680f-6407-4e35-a927-3c0613e4f3e5" containerID="6c9ace807ba3e115f342edc739655675d7e1f0d00fb436493cbb801d2a753d40" exitCode=0 Dec 01 22:00:02 crc kubenswrapper[4962]: I1201 22:00:02.599108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" 
event={"ID":"b8f0680f-6407-4e35-a927-3c0613e4f3e5","Type":"ContainerDied","Data":"6c9ace807ba3e115f342edc739655675d7e1f0d00fb436493cbb801d2a753d40"} Dec 01 22:00:02 crc kubenswrapper[4962]: I1201 22:00:02.666293 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-65cfb46b8d-72cj6" Dec 01 22:00:02 crc kubenswrapper[4962]: I1201 22:00:02.724636 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6776d74cd9-xhqgt"] Dec 01 22:00:02 crc kubenswrapper[4962]: I1201 22:00:02.725072 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6776d74cd9-xhqgt" podUID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" containerName="heat-engine" containerID="cri-o://c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" gracePeriod=60 Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.110975 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.281083 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s48cs\" (UniqueName: \"kubernetes.io/projected/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-kube-api-access-s48cs\") pod \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.281341 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-config-volume\") pod \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.281401 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-secret-volume\") pod \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\" (UID: \"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20\") " Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.283517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" (UID: "e5e2c40f-767a-4ec4-a768-f64f3d2b5b20"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.289023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" (UID: "e5e2c40f-767a-4ec4-a768-f64f3d2b5b20"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.289107 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-kube-api-access-s48cs" (OuterVolumeSpecName: "kube-api-access-s48cs") pod "e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" (UID: "e5e2c40f-767a-4ec4-a768-f64f3d2b5b20"). InnerVolumeSpecName "kube-api-access-s48cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.386524 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.386557 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.386566 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s48cs\" (UniqueName: \"kubernetes.io/projected/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20-kube-api-access-s48cs\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.612207 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.613853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9" event={"ID":"e5e2c40f-767a-4ec4-a768-f64f3d2b5b20","Type":"ContainerDied","Data":"45060130431c0ef9dc0b78e232ca89079c6daf4b8c0ee883f37757b96ba74edf"} Dec 01 22:00:03 crc kubenswrapper[4962]: I1201 22:00:03.613892 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45060130431c0ef9dc0b78e232ca89079c6daf4b8c0ee883f37757b96ba74edf" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.413718 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.618520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-inventory\") pod \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.618755 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt66w\" (UniqueName: \"kubernetes.io/projected/b8f0680f-6407-4e35-a927-3c0613e4f3e5-kube-api-access-wt66w\") pod \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.619920 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-ssh-key\") pod \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.620048 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-repo-setup-combined-ca-bundle\") pod \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\" (UID: \"b8f0680f-6407-4e35-a927-3c0613e4f3e5\") " Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.640592 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f0680f-6407-4e35-a927-3c0613e4f3e5-kube-api-access-wt66w" (OuterVolumeSpecName: 
"kube-api-access-wt66w") pod "b8f0680f-6407-4e35-a927-3c0613e4f3e5" (UID: "b8f0680f-6407-4e35-a927-3c0613e4f3e5"). InnerVolumeSpecName "kube-api-access-wt66w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.643520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" event={"ID":"b8f0680f-6407-4e35-a927-3c0613e4f3e5","Type":"ContainerDied","Data":"84fcf34b1097e01ff9cf605ff2ac16a209ee9a68ec2de12b32f302b015d54c2a"} Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.643555 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84fcf34b1097e01ff9cf605ff2ac16a209ee9a68ec2de12b32f302b015d54c2a" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.643606 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.653496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b8f0680f-6407-4e35-a927-3c0613e4f3e5" (UID: "b8f0680f-6407-4e35-a927-3c0613e4f3e5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.674187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b8f0680f-6407-4e35-a927-3c0613e4f3e5" (UID: "b8f0680f-6407-4e35-a927-3c0613e4f3e5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.702159 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-inventory" (OuterVolumeSpecName: "inventory") pod "b8f0680f-6407-4e35-a927-3c0613e4f3e5" (UID: "b8f0680f-6407-4e35-a927-3c0613e4f3e5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.723688 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.723749 4962 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.723769 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f0680f-6407-4e35-a927-3c0613e4f3e5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.723784 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt66w\" (UniqueName: \"kubernetes.io/projected/b8f0680f-6407-4e35-a927-3c0613e4f3e5-kube-api-access-wt66w\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.745155 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c"] Dec 01 22:00:04 crc kubenswrapper[4962]: E1201 22:00:04.745774 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" containerName="collect-profiles" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.745830 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" containerName="collect-profiles" Dec 01 22:00:04 crc kubenswrapper[4962]: E1201 22:00:04.745845 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f0680f-6407-4e35-a927-3c0613e4f3e5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.745856 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f0680f-6407-4e35-a927-3c0613e4f3e5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.746103 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" containerName="collect-profiles" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.746126 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f0680f-6407-4e35-a927-3c0613e4f3e5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.746977 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.763157 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c"] Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.826022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.826075 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.826099 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n95b9\" (UniqueName: \"kubernetes.io/projected/805f56ee-17d1-4e5a-8655-756050592352-kube-api-access-n95b9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.880664 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6d7cf4b459-tkf5n" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.881289 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-77477565cc-t4xcz" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.915479 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.915712 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.927983 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.928040 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.928068 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n95b9\" (UniqueName: \"kubernetes.io/projected/805f56ee-17d1-4e5a-8655-756050592352-kube-api-access-n95b9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.931418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.931420 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:04 crc kubenswrapper[4962]: I1201 22:00:04.964906 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n95b9\" (UniqueName: \"kubernetes.io/projected/805f56ee-17d1-4e5a-8655-756050592352-kube-api-access-n95b9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qs68c\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.003601 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.018105 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5ff8b998b6-kt4sg"] Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.018672 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" podUID="0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" containerName="heat-cfnapi" containerID="cri-o://8d4f4153ae312232890de01e4d5af8bb9b987f42a8438310ce6e51adfdee1ee7" gracePeriod=60 Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.079004 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59c6c9c84d-2tqrs"] Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.079608 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-59c6c9c84d-2tqrs" podUID="63a2deaf-718f-418e-bed6-b8b1351c4d85" containerName="heat-api" containerID="cri-o://16727c7a62203f9348fd8459a6a6f8c6ef71d3401d4e4feefc46901162f5a1b9" gracePeriod=60 Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.129474 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:05 crc kubenswrapper[4962]: E1201 22:00:05.349036 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 22:00:05 crc kubenswrapper[4962]: E1201 22:00:05.351149 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 22:00:05 crc kubenswrapper[4962]: E1201 22:00:05.352774 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 22:00:05 crc kubenswrapper[4962]: E1201 22:00:05.352859 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6776d74cd9-xhqgt" podUID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" containerName="heat-engine" Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.748676 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.833285 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c"] Dec 01 22:00:05 crc kubenswrapper[4962]: I1201 22:00:05.852381 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cwr7"] Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.673362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" event={"ID":"805f56ee-17d1-4e5a-8655-756050592352","Type":"ContainerStarted","Data":"90b362b97da6573499acc00bbd809b7757cad235c773da87ecd165f7ad00b51b"} Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.673750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" event={"ID":"805f56ee-17d1-4e5a-8655-756050592352","Type":"ContainerStarted","Data":"5e89e5823d34fa32990e9bbd92736e5c6abb9f914596cb32c624ae771b3eddf0"} Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.695488 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" podStartSLOduration=2.195235179 podStartE2EDuration="2.695471899s" podCreationTimestamp="2025-12-01 22:00:04 +0000 UTC" firstStartedPulling="2025-12-01 22:00:05.842198018 +0000 UTC m=+1589.943637213" lastFinishedPulling="2025-12-01 22:00:06.342434748 +0000 UTC m=+1590.443873933" observedRunningTime="2025-12-01 22:00:06.695263773 +0000 UTC m=+1590.796702988" watchObservedRunningTime="2025-12-01 22:00:06.695471899 +0000 UTC m=+1590.796911094" Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.826993 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-9zgsj"]
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.836786 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-9zgsj"]
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.881451 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-fxbhs"]
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.883099 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.885446 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.920597 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fxbhs"]
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.982008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-config-data\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.982071 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-scripts\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.982098 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x428d\" (UniqueName: \"kubernetes.io/projected/182ffb78-4709-491d-a294-c0c924cf4d5d-kube-api-access-x428d\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:06 crc kubenswrapper[4962]: I1201 22:00:06.982481 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-combined-ca-bundle\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.084313 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-config-data\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.084381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-scripts\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.084410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x428d\" (UniqueName: \"kubernetes.io/projected/182ffb78-4709-491d-a294-c0c924cf4d5d-kube-api-access-x428d\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.084618 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-combined-ca-bundle\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.089377 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-scripts\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.101761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-combined-ca-bundle\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.101989 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-config-data\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.105122 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x428d\" (UniqueName: \"kubernetes.io/projected/182ffb78-4709-491d-a294-c0c924cf4d5d-kube-api-access-x428d\") pod \"aodh-db-sync-fxbhs\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " pod="openstack/aodh-db-sync-fxbhs"
Need to start a new one" pod="openstack/aodh-db-sync-fxbhs" Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.673808 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fxbhs"] Dec 01 22:00:07 crc kubenswrapper[4962]: I1201 22:00:07.705536 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8cwr7" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="registry-server" containerID="cri-o://3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7" gracePeriod=2 Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.260720 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24511ba-ce02-420a-83c0-7ef9a6c4eb47" path="/var/lib/kubelet/pods/c24511ba-ce02-420a-83c0-7ef9a6c4eb47/volumes" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.338891 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-59c6c9c84d-2tqrs" podUID="63a2deaf-718f-418e-bed6-b8b1351c4d85" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.220:8004/healthcheck\": read tcp 10.217.0.2:41744->10.217.0.220:8004: read: connection reset by peer" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.458214 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.485138 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" podUID="0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.221:8000/healthcheck\": read tcp 10.217.0.2:51950->10.217.0.221:8000: read: connection reset by peer" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.557240 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.721595 4962 generic.go:334] "Generic (PLEG): container finished" podID="0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" containerID="8d4f4153ae312232890de01e4d5af8bb9b987f42a8438310ce6e51adfdee1ee7" exitCode=0 Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.721677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" event={"ID":"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38","Type":"ContainerDied","Data":"8d4f4153ae312232890de01e4d5af8bb9b987f42a8438310ce6e51adfdee1ee7"} Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.731494 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxbhs" event={"ID":"182ffb78-4709-491d-a294-c0c924cf4d5d","Type":"ContainerStarted","Data":"ed70861e981f115797acbe87c627e6895221ad91c70e0031aadb02e6f11323f1"} Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.735167 4962 generic.go:334] "Generic (PLEG): container finished" podID="63a2deaf-718f-418e-bed6-b8b1351c4d85" containerID="16727c7a62203f9348fd8459a6a6f8c6ef71d3401d4e4feefc46901162f5a1b9" exitCode=0 Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.735217 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c6c9c84d-2tqrs" event={"ID":"63a2deaf-718f-418e-bed6-b8b1351c4d85","Type":"ContainerDied","Data":"16727c7a62203f9348fd8459a6a6f8c6ef71d3401d4e4feefc46901162f5a1b9"} Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.749885 4962 generic.go:334] "Generic (PLEG): container 
finished" podID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerID="3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7" exitCode=0 Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.750284 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.750073 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cwr7" event={"ID":"0cb296aa-497e-434e-b234-4f54ca5bb9bd","Type":"ContainerDied","Data":"3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7"} Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.750871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cwr7" event={"ID":"0cb296aa-497e-434e-b234-4f54ca5bb9bd","Type":"ContainerDied","Data":"288ce108163db3f029496e4bcf3ec6e2df6b2294ff3d22edd359636b6caa785d"} Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.750894 4962 scope.go:117] "RemoveContainer" containerID="3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.753181 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c9fr\" (UniqueName: \"kubernetes.io/projected/0cb296aa-497e-434e-b234-4f54ca5bb9bd-kube-api-access-8c9fr\") pod \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.753233 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-utilities\") pod \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.753303 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-catalog-content\") pod \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\" (UID: \"0cb296aa-497e-434e-b234-4f54ca5bb9bd\") " Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.755295 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-utilities" (OuterVolumeSpecName: "utilities") pod "0cb296aa-497e-434e-b234-4f54ca5bb9bd" (UID: "0cb296aa-497e-434e-b234-4f54ca5bb9bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.763420 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb296aa-497e-434e-b234-4f54ca5bb9bd-kube-api-access-8c9fr" (OuterVolumeSpecName: "kube-api-access-8c9fr") pod "0cb296aa-497e-434e-b234-4f54ca5bb9bd" (UID: "0cb296aa-497e-434e-b234-4f54ca5bb9bd"). InnerVolumeSpecName "kube-api-access-8c9fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.770033 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cb296aa-497e-434e-b234-4f54ca5bb9bd" (UID: "0cb296aa-497e-434e-b234-4f54ca5bb9bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.851132 4962 scope.go:117] "RemoveContainer" containerID="3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.856276 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c9fr\" (UniqueName: \"kubernetes.io/projected/0cb296aa-497e-434e-b234-4f54ca5bb9bd-kube-api-access-8c9fr\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.856302 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.856312 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cb296aa-497e-434e-b234-4f54ca5bb9bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:08 crc kubenswrapper[4962]: I1201 22:00:08.885275 4962 scope.go:117] "RemoveContainer" containerID="14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.022845 4962 scope.go:117] "RemoveContainer" containerID="3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.023117 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 22:00:09 crc kubenswrapper[4962]: E1201 22:00:09.033277 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7\": container with ID starting with 3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7 not found: ID does not exist" containerID="3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.033329 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7"} err="failed to get container status \"3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7\": rpc error: code = NotFound desc = could not find container \"3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7\": container with ID starting with 3680cfabc6234ce2a502956294f3f6bcc351dfd309833f45756cb50a1e0799f7 not found: ID does not exist" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.033360 4962 scope.go:117] "RemoveContainer" containerID="3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827" Dec 01 22:00:09 crc kubenswrapper[4962]: E1201 22:00:09.034707 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827\": container with ID starting with 3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827 not found: ID does not exist" containerID="3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.034773 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827"} err="failed to get container 
status \"3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827\": rpc error: code = NotFound desc = could not find container \"3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827\": container with ID starting with 3e50200711c274fd78a9a62593d016d5df8f06744a90d27a72dbab5707f59827 not found: ID does not exist" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.034790 4962 scope.go:117] "RemoveContainer" containerID="14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a" Dec 01 22:00:09 crc kubenswrapper[4962]: E1201 22:00:09.035076 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a\": container with ID starting with 14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a not found: ID does not exist" containerID="14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.035098 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a"} err="failed to get container status \"14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a\": rpc error: code = NotFound desc = could not find container \"14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a\": container with ID starting with 14f09125be4159bbaa64ba89244e54e859b6b25d7f1dd5ed9e0efa468d32853a not found: ID does not exist" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.164632 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-combined-ca-bundle\") pod \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.164680 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-internal-tls-certs\") pod \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.164749 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-public-tls-certs\") pod \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.164842 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data\") pod \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.164911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data-custom\") pod \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.164983 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5296\" (UniqueName: 
\"kubernetes.io/projected/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-kube-api-access-s5296\") pod \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\" (UID: \"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.181153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" (UID: "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.181243 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-kube-api-access-s5296" (OuterVolumeSpecName: "kube-api-access-s5296") pod "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" (UID: "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38"). InnerVolumeSpecName "kube-api-access-s5296". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.213070 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" (UID: "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.276971 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.277009 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5296\" (UniqueName: \"kubernetes.io/projected/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-kube-api-access-s5296\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.277023 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.286035 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" (UID: "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.302488 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" (UID: "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.360090 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data" (OuterVolumeSpecName: "config-data") pod "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" (UID: "0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.382831 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.382866 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.382878 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.734430 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.777674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" event={"ID":"0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38","Type":"ContainerDied","Data":"4f980f4cc5d6bc6dda7468888f04db385a7f544d3d736e7d3d4d51585ae47bff"} Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.777721 4962 scope.go:117] "RemoveContainer" containerID="8d4f4153ae312232890de01e4d5af8bb9b987f42a8438310ce6e51adfdee1ee7" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.777837 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5ff8b998b6-kt4sg" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.781120 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59c6c9c84d-2tqrs" event={"ID":"63a2deaf-718f-418e-bed6-b8b1351c4d85","Type":"ContainerDied","Data":"ef3d7aaf3a2f6c9b95329b88992d39bdf7e33f165f3266c38fb916023d4e7406"} Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.782162 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59c6c9c84d-2tqrs" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.786395 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cwr7" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.824060 4962 scope.go:117] "RemoveContainer" containerID="16727c7a62203f9348fd8459a6a6f8c6ef71d3401d4e4feefc46901162f5a1b9" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.834181 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cwr7"] Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.845732 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cwr7"] Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.864041 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5ff8b998b6-kt4sg"] Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.875166 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5ff8b998b6-kt4sg"] Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.894909 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jqwb\" (UniqueName: \"kubernetes.io/projected/63a2deaf-718f-418e-bed6-b8b1351c4d85-kube-api-access-5jqwb\") pod \"63a2deaf-718f-418e-bed6-b8b1351c4d85\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.895024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data-custom\") pod \"63a2deaf-718f-418e-bed6-b8b1351c4d85\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.895068 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data\") pod \"63a2deaf-718f-418e-bed6-b8b1351c4d85\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.895140 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-public-tls-certs\") pod \"63a2deaf-718f-418e-bed6-b8b1351c4d85\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.895223 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-combined-ca-bundle\") pod \"63a2deaf-718f-418e-bed6-b8b1351c4d85\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.895370 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-internal-tls-certs\") pod \"63a2deaf-718f-418e-bed6-b8b1351c4d85\" (UID: \"63a2deaf-718f-418e-bed6-b8b1351c4d85\") " Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.899105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63a2deaf-718f-418e-bed6-b8b1351c4d85" (UID: "63a2deaf-718f-418e-bed6-b8b1351c4d85"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.916545 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a2deaf-718f-418e-bed6-b8b1351c4d85-kube-api-access-5jqwb" (OuterVolumeSpecName: "kube-api-access-5jqwb") pod "63a2deaf-718f-418e-bed6-b8b1351c4d85" (UID: "63a2deaf-718f-418e-bed6-b8b1351c4d85"). InnerVolumeSpecName "kube-api-access-5jqwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.929439 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63a2deaf-718f-418e-bed6-b8b1351c4d85" (UID: "63a2deaf-718f-418e-bed6-b8b1351c4d85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.980782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63a2deaf-718f-418e-bed6-b8b1351c4d85" (UID: "63a2deaf-718f-418e-bed6-b8b1351c4d85"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.981866 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "63a2deaf-718f-418e-bed6-b8b1351c4d85" (UID: "63a2deaf-718f-418e-bed6-b8b1351c4d85"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.993118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data" (OuterVolumeSpecName: "config-data") pod "63a2deaf-718f-418e-bed6-b8b1351c4d85" (UID: "63a2deaf-718f-418e-bed6-b8b1351c4d85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.998460 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jqwb\" (UniqueName: \"kubernetes.io/projected/63a2deaf-718f-418e-bed6-b8b1351c4d85-kube-api-access-5jqwb\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.998535 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.998551 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.998564 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.998576 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:09 crc kubenswrapper[4962]: I1201 22:00:09.998611 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a2deaf-718f-418e-bed6-b8b1351c4d85-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.132940 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59c6c9c84d-2tqrs"] Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.144363 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-59c6c9c84d-2tqrs"] Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.235221 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" path="/var/lib/kubelet/pods/0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38/volumes" Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.235987 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" path="/var/lib/kubelet/pods/0cb296aa-497e-434e-b234-4f54ca5bb9bd/volumes" Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.237379 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a2deaf-718f-418e-bed6-b8b1351c4d85" path="/var/lib/kubelet/pods/63a2deaf-718f-418e-bed6-b8b1351c4d85/volumes" Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.580409 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgzd8"] Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.580743 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cgzd8" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="registry-server" containerID="cri-o://46a9430b4dbf0b39eaecf7f583426dd4b1e567e53b6f12e680a9fc1a6add876c" gracePeriod=2 Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.803588 4962 generic.go:334] "Generic (PLEG): container finished" podID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerID="46a9430b4dbf0b39eaecf7f583426dd4b1e567e53b6f12e680a9fc1a6add876c" exitCode=0 
Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.803665 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgzd8" event={"ID":"58eb5862-e8e9-4558-8eea-fe5bbffad3e1","Type":"ContainerDied","Data":"46a9430b4dbf0b39eaecf7f583426dd4b1e567e53b6f12e680a9fc1a6add876c"} Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.810025 4962 generic.go:334] "Generic (PLEG): container finished" podID="805f56ee-17d1-4e5a-8655-756050592352" containerID="90b362b97da6573499acc00bbd809b7757cad235c773da87ecd165f7ad00b51b" exitCode=0 Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.810094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" event={"ID":"805f56ee-17d1-4e5a-8655-756050592352","Type":"ContainerDied","Data":"90b362b97da6573499acc00bbd809b7757cad235c773da87ecd165f7ad00b51b"} Dec 01 22:00:10 crc kubenswrapper[4962]: I1201 22:00:10.936620 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 22:00:11 crc kubenswrapper[4962]: I1201 22:00:11.000126 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.442463 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.641099 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n95b9\" (UniqueName: \"kubernetes.io/projected/805f56ee-17d1-4e5a-8655-756050592352-kube-api-access-n95b9\") pod \"805f56ee-17d1-4e5a-8655-756050592352\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.641487 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-ssh-key\") pod \"805f56ee-17d1-4e5a-8655-756050592352\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.641979 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-inventory\") pod \"805f56ee-17d1-4e5a-8655-756050592352\" (UID: \"805f56ee-17d1-4e5a-8655-756050592352\") " Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.646088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805f56ee-17d1-4e5a-8655-756050592352-kube-api-access-n95b9" (OuterVolumeSpecName: "kube-api-access-n95b9") pod "805f56ee-17d1-4e5a-8655-756050592352" (UID: "805f56ee-17d1-4e5a-8655-756050592352"). InnerVolumeSpecName "kube-api-access-n95b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.655746 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.694192 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-inventory" (OuterVolumeSpecName: "inventory") pod "805f56ee-17d1-4e5a-8655-756050592352" (UID: "805f56ee-17d1-4e5a-8655-756050592352"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.697430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "805f56ee-17d1-4e5a-8655-756050592352" (UID: "805f56ee-17d1-4e5a-8655-756050592352"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.744426 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n95b9\" (UniqueName: \"kubernetes.io/projected/805f56ee-17d1-4e5a-8655-756050592352-kube-api-access-n95b9\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.744455 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.744465 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/805f56ee-17d1-4e5a-8655-756050592352-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.846783 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-utilities\") pod \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.847073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-catalog-content\") pod \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.847180 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqjvr\" (UniqueName: \"kubernetes.io/projected/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-kube-api-access-jqjvr\") pod \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\" (UID: \"58eb5862-e8e9-4558-8eea-fe5bbffad3e1\") " Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.847998 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-utilities" (OuterVolumeSpecName: "utilities") pod "58eb5862-e8e9-4558-8eea-fe5bbffad3e1" (UID: "58eb5862-e8e9-4558-8eea-fe5bbffad3e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.853620 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-kube-api-access-jqjvr" (OuterVolumeSpecName: "kube-api-access-jqjvr") pod "58eb5862-e8e9-4558-8eea-fe5bbffad3e1" (UID: "58eb5862-e8e9-4558-8eea-fe5bbffad3e1"). InnerVolumeSpecName "kube-api-access-jqjvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.873498 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxbhs" event={"ID":"182ffb78-4709-491d-a294-c0c924cf4d5d","Type":"ContainerStarted","Data":"293d8ed9818712602706120aa1c390e9d45a79be67adab1be930418272593f07"} Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.882167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgzd8" event={"ID":"58eb5862-e8e9-4558-8eea-fe5bbffad3e1","Type":"ContainerDied","Data":"1e02c3b4a66276a9d510770fa62364d1a9f2faf610843a8bf38323eaf4ab1108"} Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.882205 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgzd8" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.882216 4962 scope.go:117] "RemoveContainer" containerID="46a9430b4dbf0b39eaecf7f583426dd4b1e567e53b6f12e680a9fc1a6add876c" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.887603 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" event={"ID":"805f56ee-17d1-4e5a-8655-756050592352","Type":"ContainerDied","Data":"5e89e5823d34fa32990e9bbd92736e5c6abb9f914596cb32c624ae771b3eddf0"} Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.887630 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e89e5823d34fa32990e9bbd92736e5c6abb9f914596cb32c624ae771b3eddf0" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.887681 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qs68c" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.901144 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-fxbhs" podStartSLOduration=2.3204099510000002 podStartE2EDuration="8.901120388s" podCreationTimestamp="2025-12-01 22:00:06 +0000 UTC" firstStartedPulling="2025-12-01 22:00:07.694012546 +0000 UTC m=+1591.795451741" lastFinishedPulling="2025-12-01 22:00:14.274722983 +0000 UTC m=+1598.376162178" observedRunningTime="2025-12-01 22:00:14.899210794 +0000 UTC m=+1599.000650029" watchObservedRunningTime="2025-12-01 22:00:14.901120388 +0000 UTC m=+1599.002559613" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.923356 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58eb5862-e8e9-4558-8eea-fe5bbffad3e1" (UID: "58eb5862-e8e9-4558-8eea-fe5bbffad3e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.937401 4962 scope.go:117] "RemoveContainer" containerID="3e7a9f3d9ba6091d15be3c9fcea8003466f68996ceb8178ea2d787c1fb3e8039" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.951748 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.952122 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.952146 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqjvr\" (UniqueName: \"kubernetes.io/projected/58eb5862-e8e9-4558-8eea-fe5bbffad3e1-kube-api-access-jqjvr\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:14 crc kubenswrapper[4962]: I1201 22:00:14.979620 4962 scope.go:117] "RemoveContainer" containerID="e6e6fdb55822c15a2f289d252610946e9b25a529cb8a1ab68485fc984a95b703" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.323308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgzd8"] Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.336359 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cgzd8"] Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.344790 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.346707 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.348448 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.348505 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6776d74cd9-xhqgt" podUID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" containerName="heat-engine" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.554320 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td"] Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555161 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="extract-content" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 
22:00:15.555184 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="extract-content" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555207 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a2deaf-718f-418e-bed6-b8b1351c4d85" containerName="heat-api" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555219 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a2deaf-718f-418e-bed6-b8b1351c4d85" containerName="heat-api" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555258 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" containerName="heat-cfnapi" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555271 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" containerName="heat-cfnapi" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555295 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="extract-utilities" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555308 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="extract-utilities" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555330 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="registry-server" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555343 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="registry-server" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555359 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="registry-server" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555371 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="registry-server" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555420 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="extract-utilities" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555432 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="extract-utilities" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555463 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805f56ee-17d1-4e5a-8655-756050592352" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f56ee-17d1-4e5a-8655-756050592352" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 22:00:15 crc kubenswrapper[4962]: E1201 22:00:15.555515 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="extract-content" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.555532 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="extract-content" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.556015 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0564a7bc-694c-4c6b-b7ec-7e7d26f4ea38" 
containerName="heat-cfnapi" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.556059 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="805f56ee-17d1-4e5a-8655-756050592352" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.556102 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" containerName="registry-server" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.556131 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a2deaf-718f-418e-bed6-b8b1351c4d85" containerName="heat-api" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.556162 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb296aa-497e-434e-b234-4f54ca5bb9bd" containerName="registry-server" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.558620 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.564205 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.564519 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td"] Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.565218 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.569373 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.569678 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.673976 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.674287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.674524 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.674655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvx5f\" (UniqueName: 
\"kubernetes.io/projected/6dc28247-9b3f-421b-a195-2f89ea5b50f8-kube-api-access-tvx5f\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.778819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.779006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.779185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.779325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvx5f\" (UniqueName: \"kubernetes.io/projected/6dc28247-9b3f-421b-a195-2f89ea5b50f8-kube-api-access-tvx5f\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.784233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.789304 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.798537 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvx5f\" (UniqueName: \"kubernetes.io/projected/6dc28247-9b3f-421b-a195-2f89ea5b50f8-kube-api-access-tvx5f\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.813908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-bootstrap-combined-ca-bundle\") 
pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:15 crc kubenswrapper[4962]: I1201 22:00:15.897123 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:00:16 crc kubenswrapper[4962]: I1201 22:00:16.249559 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58eb5862-e8e9-4558-8eea-fe5bbffad3e1" path="/var/lib/kubelet/pods/58eb5862-e8e9-4558-8eea-fe5bbffad3e1/volumes" Dec 01 22:00:16 crc kubenswrapper[4962]: I1201 22:00:16.535857 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td"] Dec 01 22:00:16 crc kubenswrapper[4962]: I1201 22:00:16.946997 4962 generic.go:334] "Generic (PLEG): container finished" podID="182ffb78-4709-491d-a294-c0c924cf4d5d" containerID="293d8ed9818712602706120aa1c390e9d45a79be67adab1be930418272593f07" exitCode=0 Dec 01 22:00:16 crc kubenswrapper[4962]: I1201 22:00:16.947105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxbhs" event={"ID":"182ffb78-4709-491d-a294-c0c924cf4d5d","Type":"ContainerDied","Data":"293d8ed9818712602706120aa1c390e9d45a79be67adab1be930418272593f07"} Dec 01 22:00:16 crc kubenswrapper[4962]: I1201 22:00:16.951235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" event={"ID":"6dc28247-9b3f-421b-a195-2f89ea5b50f8","Type":"ContainerStarted","Data":"8b0f477f4fe2dc79da7a842021777e81d75bcd0afb3afc2ad19b38bcd0d40929"} Dec 01 22:00:17 crc kubenswrapper[4962]: I1201 22:00:17.965553 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" event={"ID":"6dc28247-9b3f-421b-a195-2f89ea5b50f8","Type":"ContainerStarted","Data":"9c1f7b510f6a69b50e06ea235e58c04905d120e5644dd5b71df316e9f95cecd1"} Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.011955 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" podStartSLOduration=2.391163155 podStartE2EDuration="3.01191542s" podCreationTimestamp="2025-12-01 22:00:15 +0000 UTC" firstStartedPulling="2025-12-01 22:00:16.549507343 +0000 UTC m=+1600.650946538" lastFinishedPulling="2025-12-01 22:00:17.170259578 +0000 UTC m=+1601.271698803" observedRunningTime="2025-12-01 22:00:17.987786987 +0000 UTC m=+1602.089226222" watchObservedRunningTime="2025-12-01 22:00:18.01191542 +0000 UTC m=+1602.113354625" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.463642 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fxbhs" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.578681 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-scripts\") pod \"182ffb78-4709-491d-a294-c0c924cf4d5d\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.578758 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x428d\" (UniqueName: \"kubernetes.io/projected/182ffb78-4709-491d-a294-c0c924cf4d5d-kube-api-access-x428d\") pod \"182ffb78-4709-491d-a294-c0c924cf4d5d\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.578812 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-combined-ca-bundle\") pod \"182ffb78-4709-491d-a294-c0c924cf4d5d\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.579024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-config-data\") pod \"182ffb78-4709-491d-a294-c0c924cf4d5d\" (UID: \"182ffb78-4709-491d-a294-c0c924cf4d5d\") " Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.585130 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182ffb78-4709-491d-a294-c0c924cf4d5d-kube-api-access-x428d" (OuterVolumeSpecName: "kube-api-access-x428d") pod "182ffb78-4709-491d-a294-c0c924cf4d5d" (UID: "182ffb78-4709-491d-a294-c0c924cf4d5d"). InnerVolumeSpecName "kube-api-access-x428d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.589008 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-scripts" (OuterVolumeSpecName: "scripts") pod "182ffb78-4709-491d-a294-c0c924cf4d5d" (UID: "182ffb78-4709-491d-a294-c0c924cf4d5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.620011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "182ffb78-4709-491d-a294-c0c924cf4d5d" (UID: "182ffb78-4709-491d-a294-c0c924cf4d5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.622107 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-config-data" (OuterVolumeSpecName: "config-data") pod "182ffb78-4709-491d-a294-c0c924cf4d5d" (UID: "182ffb78-4709-491d-a294-c0c924cf4d5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.681862 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.681896 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.681906 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x428d\" (UniqueName: \"kubernetes.io/projected/182ffb78-4709-491d-a294-c0c924cf4d5d-kube-api-access-x428d\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.681916 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182ffb78-4709-491d-a294-c0c924cf4d5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.979497 4962 generic.go:334] "Generic (PLEG): container finished" podID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" exitCode=0 Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.979599 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6776d74cd9-xhqgt" event={"ID":"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44","Type":"ContainerDied","Data":"c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d"} Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.982117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fxbhs" event={"ID":"182ffb78-4709-491d-a294-c0c924cf4d5d","Type":"ContainerDied","Data":"ed70861e981f115797acbe87c627e6895221ad91c70e0031aadb02e6f11323f1"} Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.982156 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed70861e981f115797acbe87c627e6895221ad91c70e0031aadb02e6f11323f1" Dec 01 22:00:18 crc kubenswrapper[4962]: I1201 22:00:18.982286 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-fxbhs" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.201372 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.297370 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data-custom\") pod \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.297562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lntx7\" (UniqueName: \"kubernetes.io/projected/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-kube-api-access-lntx7\") pod \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.297603 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data\") pod \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.297672 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-combined-ca-bundle\") pod \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\" (UID: \"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44\") " Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.304659 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-kube-api-access-lntx7" (OuterVolumeSpecName: "kube-api-access-lntx7") pod "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" (UID: "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44"). InnerVolumeSpecName "kube-api-access-lntx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.306410 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" (UID: "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.333899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" (UID: "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.385291 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data" (OuterVolumeSpecName: "config-data") pod "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" (UID: "a0a5ca37-2aa2-43a5-8e9d-a58d62955f44"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.403078 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lntx7\" (UniqueName: \"kubernetes.io/projected/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-kube-api-access-lntx7\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.403137 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.403156 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:19 crc kubenswrapper[4962]: I1201 22:00:19.403175 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:20 crc kubenswrapper[4962]: I1201 22:00:20.006170 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6776d74cd9-xhqgt" event={"ID":"a0a5ca37-2aa2-43a5-8e9d-a58d62955f44","Type":"ContainerDied","Data":"93f32796cf6b2df6c8a4208ca786c163fe23f0ee5788edd5bce43df0ac416c04"} Dec 01 22:00:20 crc kubenswrapper[4962]: I1201 22:00:20.006256 4962 scope.go:117] "RemoveContainer" containerID="c38766e3f428e4e0124f10f797be6610444e5fb4a10b2633ed129a3c001a354d" Dec 01 22:00:20 crc kubenswrapper[4962]: I1201 22:00:20.006308 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6776d74cd9-xhqgt" Dec 01 22:00:20 crc kubenswrapper[4962]: I1201 22:00:20.083983 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6776d74cd9-xhqgt"] Dec 01 22:00:20 crc kubenswrapper[4962]: I1201 22:00:20.100141 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6776d74cd9-xhqgt"] Dec 01 22:00:20 crc kubenswrapper[4962]: I1201 22:00:20.265463 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" path="/var/lib/kubelet/pods/a0a5ca37-2aa2-43a5-8e9d-a58d62955f44/volumes" Dec 01 22:00:21 crc kubenswrapper[4962]: I1201 22:00:21.938532 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 01 22:00:21 crc kubenswrapper[4962]: I1201 22:00:21.939061 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-api" containerID="cri-o://2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc" gracePeriod=30 Dec 01 22:00:21 crc kubenswrapper[4962]: I1201 22:00:21.939556 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-listener" containerID="cri-o://72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225" gracePeriod=30 Dec 01 22:00:21 crc kubenswrapper[4962]: I1201 22:00:21.939600 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-notifier" containerID="cri-o://6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941" gracePeriod=30 
Dec 01 22:00:21 crc kubenswrapper[4962]: I1201 22:00:21.939630 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-evaluator" containerID="cri-o://d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff" gracePeriod=30 Dec 01 22:00:23 crc kubenswrapper[4962]: I1201 22:00:23.058668 4962 generic.go:334] "Generic (PLEG): container finished" podID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerID="d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff" exitCode=0 Dec 01 22:00:23 crc kubenswrapper[4962]: I1201 22:00:23.058907 4962 generic.go:334] "Generic (PLEG): container finished" podID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerID="2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc" exitCode=0 Dec 01 22:00:23 crc kubenswrapper[4962]: I1201 22:00:23.058742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerDied","Data":"d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff"} Dec 01 22:00:23 crc kubenswrapper[4962]: I1201 22:00:23.058963 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerDied","Data":"2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc"} Dec 01 22:00:26 crc kubenswrapper[4962]: I1201 22:00:26.106908 4962 generic.go:334] "Generic (PLEG): container finished" podID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerID="72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225" exitCode=0 Dec 01 22:00:26 crc kubenswrapper[4962]: I1201 22:00:26.107304 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerDied","Data":"72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225"} Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.679414 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.835726 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-internal-tls-certs\") pod \"01b5da04-0e15-442b-87c1-941fac20aeaf\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.835785 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bt5p\" (UniqueName: \"kubernetes.io/projected/01b5da04-0e15-442b-87c1-941fac20aeaf-kube-api-access-4bt5p\") pod \"01b5da04-0e15-442b-87c1-941fac20aeaf\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.835853 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-scripts\") pod \"01b5da04-0e15-442b-87c1-941fac20aeaf\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.835911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-config-data\") pod \"01b5da04-0e15-442b-87c1-941fac20aeaf\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.836092 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-combined-ca-bundle\") pod \"01b5da04-0e15-442b-87c1-941fac20aeaf\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.836121 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-public-tls-certs\") pod \"01b5da04-0e15-442b-87c1-941fac20aeaf\" (UID: \"01b5da04-0e15-442b-87c1-941fac20aeaf\") " Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.845624 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-scripts" (OuterVolumeSpecName: "scripts") pod "01b5da04-0e15-442b-87c1-941fac20aeaf" (UID: "01b5da04-0e15-442b-87c1-941fac20aeaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.858272 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b5da04-0e15-442b-87c1-941fac20aeaf-kube-api-access-4bt5p" (OuterVolumeSpecName: "kube-api-access-4bt5p") pod "01b5da04-0e15-442b-87c1-941fac20aeaf" (UID: "01b5da04-0e15-442b-87c1-941fac20aeaf"). InnerVolumeSpecName "kube-api-access-4bt5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.939077 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bt5p\" (UniqueName: \"kubernetes.io/projected/01b5da04-0e15-442b-87c1-941fac20aeaf-kube-api-access-4bt5p\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.939103 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.961748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "01b5da04-0e15-442b-87c1-941fac20aeaf" (UID: "01b5da04-0e15-442b-87c1-941fac20aeaf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:27 crc kubenswrapper[4962]: I1201 22:00:27.990215 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "01b5da04-0e15-442b-87c1-941fac20aeaf" (UID: "01b5da04-0e15-442b-87c1-941fac20aeaf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.041241 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.042013 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.042379 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-config-data" (OuterVolumeSpecName: "config-data") pod "01b5da04-0e15-442b-87c1-941fac20aeaf" (UID: "01b5da04-0e15-442b-87c1-941fac20aeaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.048390 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01b5da04-0e15-442b-87c1-941fac20aeaf" (UID: "01b5da04-0e15-442b-87c1-941fac20aeaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.133395 4962 generic.go:334] "Generic (PLEG): container finished" podID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerID="6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941" exitCode=0 Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.133470 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.133469 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerDied","Data":"6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941"} Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.133771 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"01b5da04-0e15-442b-87c1-941fac20aeaf","Type":"ContainerDied","Data":"3472de3d4f771beb86d9af47d5c192f06e17bc8bb6f53146bb061a327191943f"} Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.133790 4962 scope.go:117] "RemoveContainer" containerID="72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.145221 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.145248 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b5da04-0e15-442b-87c1-941fac20aeaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.166712 4962 scope.go:117] "RemoveContainer" containerID="6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.184012 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.204114 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.260477 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" path="/var/lib/kubelet/pods/01b5da04-0e15-442b-87c1-941fac20aeaf/volumes" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.262269 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.262673 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" containerName="heat-engine" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.262693 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" containerName="heat-engine" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.262732 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-evaluator" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.262741 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-evaluator" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.262763 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-listener" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.262772 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-listener" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.262811 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" 
containerName="aodh-api" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.262821 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-api" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.262854 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-notifier" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.262864 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-notifier" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.262877 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182ffb78-4709-491d-a294-c0c924cf4d5d" containerName="aodh-db-sync" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.262885 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="182ffb78-4709-491d-a294-c0c924cf4d5d" containerName="aodh-db-sync" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.263410 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-evaluator" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.263441 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="182ffb78-4709-491d-a294-c0c924cf4d5d" containerName="aodh-db-sync" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.263457 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-notifier" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.263467 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a5ca37-2aa2-43a5-8e9d-a58d62955f44" containerName="heat-engine" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.263546 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-api" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.263564 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b5da04-0e15-442b-87c1-941fac20aeaf" containerName="aodh-listener" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.267278 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.268492 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.275995 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.276545 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.276719 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-bf8t4" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.276898 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.289228 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.317154 4962 scope.go:117] "RemoveContainer" containerID="d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.352872 4962 scope.go:117] "RemoveContainer" containerID="2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.361882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-public-tls-certs\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.361968 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lsd6\" (UniqueName: \"kubernetes.io/projected/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-kube-api-access-9lsd6\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.361995 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.362049 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-scripts\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.362082 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-internal-tls-certs\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.362240 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-config-data\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.373871 4962 scope.go:117] "RemoveContainer" 
containerID="72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.374456 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225\": container with ID starting with 72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225 not found: ID does not exist" containerID="72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.374495 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225"} err="failed to get container status \"72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225\": rpc error: code = NotFound desc = could not find container \"72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225\": container with ID starting with 72339093c5bd948f8ebc0537ff7b20ed87883f6ba7a8384c485a2883f5497225 not found: ID does not exist" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.374521 4962 scope.go:117] "RemoveContainer" containerID="6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.374852 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941\": container with ID starting with 6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941 not found: ID does not exist" containerID="6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.374878 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941"} err="failed to get container status \"6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941\": rpc error: code = NotFound desc = could not find container \"6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941\": container with ID starting with 6651bb97fb232768e5f823c1ddfb513ca5c7505398718dc627f72c6088e7a941 not found: ID does not exist" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.374891 4962 scope.go:117] "RemoveContainer" containerID="d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.375119 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff\": container with ID starting with d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff not found: ID does not exist" containerID="d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.375141 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff"} err="failed to get container status \"d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff\": rpc error: code = NotFound desc = could not find container \"d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff\": container with ID starting with 
d11df9a3ad95b4d8ff1402a72f82ab05f4352cf9afb8a5e46a1a2a829116f4ff not found: ID does not exist" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.375156 4962 scope.go:117] "RemoveContainer" containerID="2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc" Dec 01 22:00:28 crc kubenswrapper[4962]: E1201 22:00:28.375320 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc\": container with ID starting with 2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc not found: ID does not exist" containerID="2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.375340 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc"} err="failed to get container status \"2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc\": rpc error: code = NotFound desc = could not find container \"2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc\": container with ID starting with 2a6b4f2584e177ecd418182a17c7e0b9ea8bb9cca0bb4a3bb48d4e9535339ecc not found: ID does not exist" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.464186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lsd6\" (UniqueName: \"kubernetes.io/projected/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-kube-api-access-9lsd6\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.464242 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.464287 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-scripts\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.464315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-internal-tls-certs\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.464472 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-config-data\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.464504 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-public-tls-certs\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.468857 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-scripts\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.469182 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-config-data\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.469637 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-public-tls-certs\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.471521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.474847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-internal-tls-certs\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.483418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lsd6\" (UniqueName: \"kubernetes.io/projected/f1377a9a-eb6d-42f1-89f5-f8383c69b93e-kube-api-access-9lsd6\") pod \"aodh-0\" (UID: \"f1377a9a-eb6d-42f1-89f5-f8383c69b93e\") " pod="openstack/aodh-0" Dec 01 22:00:28 crc kubenswrapper[4962]: I1201 22:00:28.594538 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 22:00:29 crc kubenswrapper[4962]: I1201 22:00:29.197000 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 22:00:30 crc kubenswrapper[4962]: I1201 22:00:30.157451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1377a9a-eb6d-42f1-89f5-f8383c69b93e","Type":"ContainerStarted","Data":"4312c0b13d1a9605d47ac6064502fd3c368fd31c270206b263cb64b83b511495"} Dec 01 22:00:30 crc kubenswrapper[4962]: I1201 22:00:30.157817 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1377a9a-eb6d-42f1-89f5-f8383c69b93e","Type":"ContainerStarted","Data":"f26a2efd2f07fc571a667189e53f0d30264a65b5cddcd83181082dfbe2ac868a"} Dec 01 22:00:32 crc kubenswrapper[4962]: I1201 22:00:32.187893 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1377a9a-eb6d-42f1-89f5-f8383c69b93e","Type":"ContainerStarted","Data":"159e9eb29996fbf3f1a215df7cf9b7696ca9ef1dc5a8cc066b314355df96f869"} Dec 01 22:00:32 crc kubenswrapper[4962]: I1201 22:00:32.786022 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:00:32 crc kubenswrapper[4962]: I1201 22:00:32.786343 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:00:33 crc kubenswrapper[4962]: I1201 22:00:33.201529 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1377a9a-eb6d-42f1-89f5-f8383c69b93e","Type":"ContainerStarted","Data":"476ad4d417b9b73402a0a1786da043875dfba961ce41a2a3df80726dcbf5a963"} Dec 01 22:00:35 crc kubenswrapper[4962]: I1201 22:00:35.300704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f1377a9a-eb6d-42f1-89f5-f8383c69b93e","Type":"ContainerStarted","Data":"f45e0d90cf460ec13ff18e125ec6ffd3af13d418bc7cd7fea42249431e9829fc"} Dec 01 22:00:35 crc kubenswrapper[4962]: I1201 22:00:35.322735 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.372973274 podStartE2EDuration="7.322716399s" podCreationTimestamp="2025-12-01 22:00:28 +0000 UTC" firstStartedPulling="2025-12-01 22:00:29.209595678 +0000 UTC m=+1613.311034883" lastFinishedPulling="2025-12-01 22:00:34.159338813 +0000 UTC m=+1618.260778008" observedRunningTime="2025-12-01 22:00:35.319856788 +0000 UTC m=+1619.421296003" watchObservedRunningTime="2025-12-01 22:00:35.322716399 +0000 UTC m=+1619.424155604" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.177870 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29410441-sgtnd"] Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.182172 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.237366 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410441-sgtnd"] Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.317288 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-fernet-keys\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.317477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-combined-ca-bundle\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.317534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-config-data\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.317837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2mm4\" (UniqueName: \"kubernetes.io/projected/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-kube-api-access-v2mm4\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.419834 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2mm4\" (UniqueName: \"kubernetes.io/projected/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-kube-api-access-v2mm4\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.420178 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-fernet-keys\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.420225 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-combined-ca-bundle\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.420251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-config-data\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.427631 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-config-data\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.429495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-fernet-keys\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.438134 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-combined-ca-bundle\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.455520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2mm4\" (UniqueName: \"kubernetes.io/projected/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-kube-api-access-v2mm4\") pod \"keystone-cron-29410441-sgtnd\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:00 crc kubenswrapper[4962]: I1201 22:01:00.528123 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:01 crc kubenswrapper[4962]: I1201 22:01:01.037529 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410441-sgtnd"] Dec 01 22:01:01 crc kubenswrapper[4962]: I1201 22:01:01.672478 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410441-sgtnd" event={"ID":"14dca7aa-3ee9-4af1-85ba-e92ac88fd223","Type":"ContainerStarted","Data":"e736ec957c99bc179371759a608da0b9ff5290cacca413a3e4b992b253f62cc4"} Dec 01 22:01:01 crc kubenswrapper[4962]: I1201 22:01:01.672815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410441-sgtnd" event={"ID":"14dca7aa-3ee9-4af1-85ba-e92ac88fd223","Type":"ContainerStarted","Data":"a98bb3b5fc2d4329c389d988ba454764dbf32994ce8f30799c0510a5a4de41f4"} Dec 01 22:01:01 crc kubenswrapper[4962]: I1201 22:01:01.710480 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29410441-sgtnd" podStartSLOduration=1.7104371980000002 podStartE2EDuration="1.710437198s" podCreationTimestamp="2025-12-01 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 22:01:01.702899764 +0000 UTC m=+1645.804338999" watchObservedRunningTime="2025-12-01 22:01:01.710437198 +0000 UTC m=+1645.811876403" Dec 01 22:01:01 crc kubenswrapper[4962]: I1201 22:01:01.760261 4962 scope.go:117] "RemoveContainer" containerID="6229a94679f1738612a7755d3dfa7af7a89ee9d8252d01528edb8b738d3cd3cc" Dec 01 22:01:01 crc kubenswrapper[4962]: I1201 22:01:01.797668 4962 scope.go:117] "RemoveContainer" containerID="3a870dd0de19c7ac629701cb1c7e5d27c5d5f4d2d56d910a99a91d08aa4523cd" Dec 01 22:01:02 crc kubenswrapper[4962]: I1201 22:01:02.786334 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:01:02 crc kubenswrapper[4962]: I1201 22:01:02.786636 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:01:04 crc kubenswrapper[4962]: I1201 22:01:04.721047 4962 generic.go:334] "Generic (PLEG): container finished" podID="14dca7aa-3ee9-4af1-85ba-e92ac88fd223" containerID="e736ec957c99bc179371759a608da0b9ff5290cacca413a3e4b992b253f62cc4" exitCode=0 Dec 01 22:01:04 crc kubenswrapper[4962]: I1201 22:01:04.721133 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410441-sgtnd" event={"ID":"14dca7aa-3ee9-4af1-85ba-e92ac88fd223","Type":"ContainerDied","Data":"e736ec957c99bc179371759a608da0b9ff5290cacca413a3e4b992b253f62cc4"} Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.286749 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.337626 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2mm4\" (UniqueName: \"kubernetes.io/projected/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-kube-api-access-v2mm4\") pod \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.337979 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-config-data\") pod \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.338089 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-fernet-keys\") pod \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.338144 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-combined-ca-bundle\") pod \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\" (UID: \"14dca7aa-3ee9-4af1-85ba-e92ac88fd223\") " Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.350640 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-kube-api-access-v2mm4" (OuterVolumeSpecName: "kube-api-access-v2mm4") pod "14dca7aa-3ee9-4af1-85ba-e92ac88fd223" (UID: "14dca7aa-3ee9-4af1-85ba-e92ac88fd223"). InnerVolumeSpecName "kube-api-access-v2mm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.351065 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "14dca7aa-3ee9-4af1-85ba-e92ac88fd223" (UID: "14dca7aa-3ee9-4af1-85ba-e92ac88fd223"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.390478 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14dca7aa-3ee9-4af1-85ba-e92ac88fd223" (UID: "14dca7aa-3ee9-4af1-85ba-e92ac88fd223"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.423971 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-config-data" (OuterVolumeSpecName: "config-data") pod "14dca7aa-3ee9-4af1-85ba-e92ac88fd223" (UID: "14dca7aa-3ee9-4af1-85ba-e92ac88fd223"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.442498 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.442544 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.442555 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.442569 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2mm4\" (UniqueName: \"kubernetes.io/projected/14dca7aa-3ee9-4af1-85ba-e92ac88fd223-kube-api-access-v2mm4\") on node \"crc\" DevicePath \"\"" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.769710 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410441-sgtnd" event={"ID":"14dca7aa-3ee9-4af1-85ba-e92ac88fd223","Type":"ContainerDied","Data":"a98bb3b5fc2d4329c389d988ba454764dbf32994ce8f30799c0510a5a4de41f4"} Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.770095 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98bb3b5fc2d4329c389d988ba454764dbf32994ce8f30799c0510a5a4de41f4" Dec 01 22:01:06 crc kubenswrapper[4962]: I1201 22:01:06.769871 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410441-sgtnd" Dec 01 22:01:32 crc kubenswrapper[4962]: I1201 22:01:32.784805 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:01:32 crc kubenswrapper[4962]: I1201 22:01:32.785716 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:01:32 crc kubenswrapper[4962]: I1201 22:01:32.785790 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 22:01:32 crc kubenswrapper[4962]: I1201 22:01:32.787075 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 22:01:32 crc kubenswrapper[4962]: I1201 22:01:32.787186 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" gracePeriod=600 Dec 01 22:01:32 crc kubenswrapper[4962]: E1201 22:01:32.926015 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:01:33 crc kubenswrapper[4962]: I1201 22:01:33.199100 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" exitCode=0 Dec 01 22:01:33 crc kubenswrapper[4962]: I1201 22:01:33.199154 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f"} Dec 01 22:01:33 crc kubenswrapper[4962]: I1201 22:01:33.199189 4962 scope.go:117] "RemoveContainer" containerID="be316a715ee51336fb8f9d77180528af883eb796f40ec81884b6acc27922aa28" Dec 01 22:01:33 crc kubenswrapper[4962]: I1201 22:01:33.200304 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:01:33 crc kubenswrapper[4962]: E1201 22:01:33.200999 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:01:48 crc kubenswrapper[4962]: I1201 22:01:48.220049 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:01:48 crc kubenswrapper[4962]: E1201 22:01:48.220793 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:02:02 crc kubenswrapper[4962]: I1201 22:02:02.153822 4962 scope.go:117] "RemoveContainer" containerID="f30cd32a16ae30fb07bc255f27a885076d3e7f6d6b645546881a4289c0a926e9" Dec 01 22:02:02 crc kubenswrapper[4962]: I1201 22:02:02.222825 4962 scope.go:117] "RemoveContainer" containerID="7a0e052c4595839848ef3ee1213e2439112bd2167ad32ddfb8ab42a9161cb824" Dec 01 22:02:03 crc kubenswrapper[4962]: I1201 22:02:03.221503 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:02:03 crc kubenswrapper[4962]: E1201 22:02:03.222601 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:02:16 crc kubenswrapper[4962]: I1201 22:02:16.231527 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:02:16 crc kubenswrapper[4962]: E1201 22:02:16.232577 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:02:30 crc kubenswrapper[4962]: I1201 22:02:30.220620 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:02:30 crc kubenswrapper[4962]: E1201 22:02:30.221928 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:02:44 crc kubenswrapper[4962]: I1201 22:02:44.229022 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f"
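
Note: from here on the machine-config-daemon entries repeat the same pair, RemoveContainer followed by "Error syncing pod, skipping ... back-off 5m0s", every ten to fifteen seconds. Each periodic pod sync re-attempts the restart and is refused while the back-off window is still open. The kubelet's crash-loop back-off doubles from an initial 10s up to a 5m cap, which is why every refusal in this log already reads 5m0s; a sketch of that schedule (the constants mirror the upstream kubelet defaults and should be treated as an assumption for this particular build):

    # Crash-loop restart back-off: 10s initial delay, doubled per restart,
    # capped at 300s. After the fifth crash every further attempt waits the
    # full 5m0s, matching the "back-off 5m0s" refusals logged here.
    BASE_S, CAP_S = 10, 300

    def backoff_after(crashes: int) -> int:
        """Delay in seconds before the next restart attempt."""
        return min(BASE_S * 2 ** crashes, CAP_S)

    print([backoff_after(n) for n in range(8)])
    # [10, 20, 40, 80, 160, 300, 300, 300]
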
Dec 01 22:02:44 crc kubenswrapper[4962]: E1201 22:02:44.237716 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:02:59 crc kubenswrapper[4962]: I1201 22:02:59.219852 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:02:59 crc kubenswrapper[4962]: E1201 22:02:59.221296 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:03:02 crc kubenswrapper[4962]: I1201 22:03:02.404048 4962 scope.go:117] "RemoveContainer" containerID="1a73bd9240037ee68094ed3e267ab37cd096c0744898f10ce5ddcc92cdf5849f" Dec 01 22:03:02 crc kubenswrapper[4962]: I1201 22:03:02.447097 4962 scope.go:117] "RemoveContainer" containerID="5a6778c5ed2512f1ec2efda396e8e9535eb9077d93cd0315583df081b2eadf0f" Dec 01 22:03:02 crc kubenswrapper[4962]: I1201 22:03:02.476116 4962 scope.go:117] "RemoveContainer" containerID="2da244bf8c60657deaac54de9fd127e4ff9e1c7fd734afd69cae2978ac0f3a5c" Dec 01 22:03:02 crc kubenswrapper[4962]: I1201 22:03:02.508700 4962 scope.go:117] "RemoveContainer" containerID="5afe056cdbf7eb9c4ee729101a3f13b72572b19443d8c503f40610110b805f8b" Dec 01 22:03:02 crc kubenswrapper[4962]: I1201 22:03:02.563867 4962 scope.go:117] "RemoveContainer" containerID="19e4e8aa5d7790937197a98667f4c39ae0c70400f1c571e288d265fbf070736c" Dec 01 22:03:10 crc kubenswrapper[4962]: I1201 22:03:10.220550 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:03:10 crc kubenswrapper[4962]: E1201 22:03:10.222204 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:03:25 crc kubenswrapper[4962]: I1201 22:03:25.223280 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:03:25 crc kubenswrapper[4962]: E1201 22:03:25.224536 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:03:34 crc kubenswrapper[4962]: I1201 22:03:34.074130 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bk224"] Dec 01 22:03:34 crc kubenswrapper[4962]: I1201 
22:03:34.099523 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bk224"] Dec 01 22:03:34 crc kubenswrapper[4962]: I1201 22:03:34.250612 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc37b492-d684-42d9-a258-391677fbb9d7" path="/var/lib/kubelet/pods/fc37b492-d684-42d9-a258-391677fbb9d7/volumes" Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.048125 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3f94-account-create-update-tcsxp"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.062159 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3d8a-account-create-update-8j7jd"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.079418 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-qd95v"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.089735 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3d8a-account-create-update-8j7jd"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.100056 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8b9sd"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.110192 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3f94-account-create-update-tcsxp"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.120178 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-qd95v"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.130309 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gxffv"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.141667 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8b9sd"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.153383 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gxffv"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.164044 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-3071-account-create-update-pqf8s"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.176107 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4fa1-account-create-update-rfbmh"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.188971 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4fa1-account-create-update-rfbmh"] Dec 01 22:03:35 crc kubenswrapper[4962]: I1201 22:03:35.222548 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-3071-account-create-update-pqf8s"] Dec 01 22:03:36 crc kubenswrapper[4962]: I1201 22:03:36.239626 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0" path="/var/lib/kubelet/pods/057caf82-bc3a-4a9a-92a8-ac85a2ce7fd0/volumes" Dec 01 22:03:36 crc kubenswrapper[4962]: I1201 22:03:36.242899 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287023aa-b843-4ff9-bca0-7c5fcfc67688" path="/var/lib/kubelet/pods/287023aa-b843-4ff9-bca0-7c5fcfc67688/volumes" Dec 01 22:03:36 crc kubenswrapper[4962]: I1201 22:03:36.247902 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f2f133-6737-4ab7-ab43-4de519e2d4c0" path="/var/lib/kubelet/pods/40f2f133-6737-4ab7-ab43-4de519e2d4c0/volumes"
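
Note: the burst of SyncLoop DELETE/REMOVE pairs here (continuing below) is the API-side cleanup of long-finished db-create and account-create-update job pods. As each pod object disappears from the API, the kubelet deletes the pod's leftover state on disk and logs "Cleaned up orphaned pod volumes dir" with the corresponding /var/lib/kubelet/pods/<uid>/volumes path. An illustrative check, run on the node itself, listing pod UIDs whose volume state is still on disk (the directory layout is the standard kubelet one; comparing the output against the API server's pod list is left to the operator):

    import os

    PODS_DIR = "/var/lib/kubelet/pods"  # standard kubelet state directory

    # Pod UIDs that still have a volumes/ subtree on disk; any UID with no
    # matching pod in the API is a candidate for the orphan cleanup above.
    for uid in sorted(os.listdir(PODS_DIR)):
        if os.path.isdir(os.path.join(PODS_DIR, uid, "volumes")):
            print(uid)
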
Dec 01 22:03:36 crc kubenswrapper[4962]: I1201 22:03:36.255736 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a602fc-6e93-4dff-a482-64734dd6a817" path="/var/lib/kubelet/pods/75a602fc-6e93-4dff-a482-64734dd6a817/volumes" Dec 01 22:03:36 crc kubenswrapper[4962]: I1201 22:03:36.259379 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e37bf0-3279-477d-bb05-cb6744af0908" path="/var/lib/kubelet/pods/90e37bf0-3279-477d-bb05-cb6744af0908/volumes" Dec 01 22:03:36 crc kubenswrapper[4962]: I1201 22:03:36.261092 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb" path="/var/lib/kubelet/pods/9e9931f2-fa3e-4c5a-a8ea-fbfe5dbb25fb/volumes" Dec 01 22:03:36 crc kubenswrapper[4962]: I1201 22:03:36.262921 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05e4ab6-7ea8-4adb-b276-f2c1883ac638" path="/var/lib/kubelet/pods/c05e4ab6-7ea8-4adb-b276-f2c1883ac638/volumes" Dec 01 22:03:39 crc kubenswrapper[4962]: I1201 22:03:39.220481 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:03:39 crc kubenswrapper[4962]: E1201 22:03:39.221299 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:03:44 crc kubenswrapper[4962]: I1201 22:03:44.038310 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn"] Dec 01 22:03:44 crc kubenswrapper[4962]: I1201 22:03:44.054056 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-af24-account-create-update-g6rz5"] Dec 01 22:03:44 crc kubenswrapper[4962]: I1201 22:03:44.064639 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bj9qn"] Dec 01 22:03:44 crc kubenswrapper[4962]: I1201 22:03:44.074879 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-af24-account-create-update-g6rz5"] Dec 01 22:03:44 crc kubenswrapper[4962]: I1201 22:03:44.233159 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095fd634-41e6-4675-b439-50fe9f184b3a" path="/var/lib/kubelet/pods/095fd634-41e6-4675-b439-50fe9f184b3a/volumes" Dec 01 22:03:44 crc kubenswrapper[4962]: I1201 22:03:44.234267 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dbe1ae4-c145-451d-9350-d1172e7042d3" path="/var/lib/kubelet/pods/0dbe1ae4-c145-451d-9350-d1172e7042d3/volumes" Dec 01 22:03:51 crc kubenswrapper[4962]: I1201 22:03:51.065022 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dr5bl"] Dec 01 22:03:51 crc kubenswrapper[4962]: I1201 22:03:51.078202 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dr5bl"] Dec 01 22:03:51 crc kubenswrapper[4962]: I1201 22:03:51.220120 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:03:51 crc kubenswrapper[4962]: E1201 22:03:51.220484 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.054774 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-bbd8-account-create-update-tnv4m"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.069005 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bg7z9"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.082888 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-58e3-account-create-update-qf5td"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.094954 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lpdtt"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.107119 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-f7vp4"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.118283 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bg7z9"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.128867 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dbc8-account-create-update-x8lg9"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.139023 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-bbd8-account-create-update-tnv4m"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.149205 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lpdtt"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.159352 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-58e3-account-create-update-qf5td"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.169557 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dbc8-account-create-update-x8lg9"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.179857 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-f7vp4"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.188916 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8031-account-create-update-x2qgm"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.198373 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8031-account-create-update-x2qgm"] Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.233824 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e77231-0423-458f-8262-aa12c2536566" path="/var/lib/kubelet/pods/29e77231-0423-458f-8262-aa12c2536566/volumes" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.236127 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f45ac0-c39d-4785-957e-e69e1659927e" path="/var/lib/kubelet/pods/37f45ac0-c39d-4785-957e-e69e1659927e/volumes" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.237556 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cd1b6eb-52cf-44aa-993a-90d3abec28ad" path="/var/lib/kubelet/pods/5cd1b6eb-52cf-44aa-993a-90d3abec28ad/volumes" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.239217 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="8d0bb151-9b17-4471-9b0d-05f74fa33f0c" path="/var/lib/kubelet/pods/8d0bb151-9b17-4471-9b0d-05f74fa33f0c/volumes" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.241452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7515449-4f20-4673-ac23-a7a5a40f852d" path="/var/lib/kubelet/pods/a7515449-4f20-4673-ac23-a7a5a40f852d/volumes" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.243203 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b539cc27-f876-40e3-b77a-8af750ce5b3a" path="/var/lib/kubelet/pods/b539cc27-f876-40e3-b77a-8af750ce5b3a/volumes" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.244718 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa1c735-01c9-4f3e-ae1d-32bc2af0972d" path="/var/lib/kubelet/pods/bfa1c735-01c9-4f3e-ae1d-32bc2af0972d/volumes" Dec 01 22:03:52 crc kubenswrapper[4962]: I1201 22:03:52.246567 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c889a0f9-46ee-413a-bae1-94ee4eb8f16d" path="/var/lib/kubelet/pods/c889a0f9-46ee-413a-bae1-94ee4eb8f16d/volumes" Dec 01 22:04:02 crc kubenswrapper[4962]: I1201 22:04:02.707063 4962 scope.go:117] "RemoveContainer" containerID="30bb1e4077ae583f12c4c6689e334cdf76af4dfb6afe2c61c6b099563f3eb6a2" Dec 01 22:04:02 crc kubenswrapper[4962]: I1201 22:04:02.738284 4962 scope.go:117] "RemoveContainer" containerID="8a1c0fb7af3abf3b53e289d52dcbb4c123cacf5925374f7d1493f706edb38d79" Dec 01 22:04:02 crc kubenswrapper[4962]: I1201 22:04:02.830319 4962 scope.go:117] "RemoveContainer" containerID="8ab4004fc2e5dc0799776153309c837efc8148de0b5ea7246b5dc107b10df66a" Dec 01 22:04:02 crc kubenswrapper[4962]: I1201 22:04:02.909679 4962 scope.go:117] "RemoveContainer" containerID="8f2da0fd1ec5fd019c5150b84d509035ed363835c9f0a0203d2563fdbc807f70" Dec 01 22:04:02 crc kubenswrapper[4962]: I1201 22:04:02.964408 4962 scope.go:117] "RemoveContainer" containerID="64de11a8be27f77a842e09d38601f7326e01819ceb15dad617e7e51d9e526011" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.027373 4962 scope.go:117] "RemoveContainer" containerID="89272950bdbf82012cd12825bfcbf1da1de172ee4d61b5963fa60c515724d833" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.063079 4962 scope.go:117] "RemoveContainer" containerID="b022597fb5249e1e727c92ba7738ab1dc71db487e4c705c7d496197bf238eee1" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.195905 4962 scope.go:117] "RemoveContainer" containerID="9972fa75fbfa3f510c06489b11cac1b796cb70719b4acf05a42d6aa6303f5de1" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.220576 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:04:03 crc kubenswrapper[4962]: E1201 22:04:03.220955 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.320375 4962 scope.go:117] "RemoveContainer" containerID="3ade7d54e515b3af6ab7f6813de832c61375247f29b19e7e84248ba3942b0745" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.371510 4962 scope.go:117] "RemoveContainer" 
containerID="58e8b4e147d076ee2c8ef454df8e0fdd9b54d15263c0c463a2b94585625c6b77" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.398646 4962 scope.go:117] "RemoveContainer" containerID="4b10d51e358d870a2c8b9428c9c8d99acd292597b8743870c73fe4b1823372d1" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.419154 4962 scope.go:117] "RemoveContainer" containerID="4c6cb28056486117a7cacbe282e06f687ec3d7e1358ff656bc53622675e1876a" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.446980 4962 scope.go:117] "RemoveContainer" containerID="2833172e38a500bfe10d49fda192ef2936fa2ad2ae35f63856a63fb81bd140eb" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.480007 4962 scope.go:117] "RemoveContainer" containerID="b260d26addbd49a8a9f9d59648defcb6b45e3e5b147d3fdcf866e7baf96f703f" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.506948 4962 scope.go:117] "RemoveContainer" containerID="0b8cf51fd167f1e54c9c805e29c1a7697b89ff963a7381ef0fe424bb972a6250" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.538963 4962 scope.go:117] "RemoveContainer" containerID="77f87a9016ecab1c560188b5260a06146211e48bfc5c67248b63450fca93f960" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.565927 4962 scope.go:117] "RemoveContainer" containerID="b3615ed752a8c2690297a5c0938d81c9c0b3433c8a9c6df6f4abc6631c78ffae" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.592984 4962 scope.go:117] "RemoveContainer" containerID="a5728c452ee33f2eafd2f8441ed6962da04a90c4a8f27f9f3f1495b681f8bcdc" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.625505 4962 scope.go:117] "RemoveContainer" containerID="7e3cf5e00421be7fc94877dd0e0c9871052b09c5b0041455aa5cf7869223e600" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.651097 4962 scope.go:117] "RemoveContainer" containerID="7e4570bd611c799e9c21509ffe5b73a1bdfd7d06d349aaae1e9e520a1bc9174b" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.683277 4962 scope.go:117] "RemoveContainer" containerID="511ff24f0bc730b93e596c22bc2f62060c72391a0076d989f6d5ad1cf7f2cd12" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.711073 4962 scope.go:117] "RemoveContainer" containerID="e46de99b8473033437861ecfef864ffc26a78968b854651184220f433944c7d5" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.744439 4962 scope.go:117] "RemoveContainer" containerID="bc64d8a39eaab4dc24ac303ee3d12b50a27b7806023f4ec717d25cdc571c95c4" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.819419 4962 scope.go:117] "RemoveContainer" containerID="269869044a427490757176305f563e559f01fad883064266d82b586da8054a16" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.847418 4962 scope.go:117] "RemoveContainer" containerID="8829dfd12517b7a3c99f992987be4805b73b3ae2e753eca56a8bde276acdaec7" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.870868 4962 scope.go:117] "RemoveContainer" containerID="b689f1199444cf62162b7b56d7045dee1401338f5e4f1e0e90a273c84b6dfe8f" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.936346 4962 scope.go:117] "RemoveContainer" containerID="29e1028ed8f8120707ef6f71cc59fdc41314e0ea8c38c91d05363a3950f63b1f" Dec 01 22:04:03 crc kubenswrapper[4962]: I1201 22:04:03.985869 4962 scope.go:117] "RemoveContainer" containerID="0e9aab10cf8f465f91a67c0776211da3fcb1b8a3473f1328b1a6d4d40b74c6e2" Dec 01 22:04:04 crc kubenswrapper[4962]: I1201 22:04:04.056381 4962 scope.go:117] "RemoveContainer" containerID="0efd232c766d87d65cec6a2ccc46feda5170654f1da631611abc283402224ee7" Dec 01 22:04:04 crc kubenswrapper[4962]: I1201 22:04:04.104544 4962 
scope.go:117] "RemoveContainer" containerID="32b621ce7bb8e9b7672dc0f0861f7989a2e83922bd9cdcc8a372a5bdc0b8ada4" Dec 01 22:04:08 crc kubenswrapper[4962]: I1201 22:04:08.812643 4962 generic.go:334] "Generic (PLEG): container finished" podID="6dc28247-9b3f-421b-a195-2f89ea5b50f8" containerID="9c1f7b510f6a69b50e06ea235e58c04905d120e5644dd5b71df316e9f95cecd1" exitCode=0 Dec 01 22:04:08 crc kubenswrapper[4962]: I1201 22:04:08.812760 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" event={"ID":"6dc28247-9b3f-421b-a195-2f89ea5b50f8","Type":"ContainerDied","Data":"9c1f7b510f6a69b50e06ea235e58c04905d120e5644dd5b71df316e9f95cecd1"} Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.367928 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.479220 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-ssh-key\") pod \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.479299 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-bootstrap-combined-ca-bundle\") pod \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.479349 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-inventory\") pod \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.479512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvx5f\" (UniqueName: \"kubernetes.io/projected/6dc28247-9b3f-421b-a195-2f89ea5b50f8-kube-api-access-tvx5f\") pod \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\" (UID: \"6dc28247-9b3f-421b-a195-2f89ea5b50f8\") " Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.485798 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc28247-9b3f-421b-a195-2f89ea5b50f8-kube-api-access-tvx5f" (OuterVolumeSpecName: "kube-api-access-tvx5f") pod "6dc28247-9b3f-421b-a195-2f89ea5b50f8" (UID: "6dc28247-9b3f-421b-a195-2f89ea5b50f8"). InnerVolumeSpecName "kube-api-access-tvx5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.491138 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6dc28247-9b3f-421b-a195-2f89ea5b50f8" (UID: "6dc28247-9b3f-421b-a195-2f89ea5b50f8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.518168 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6dc28247-9b3f-421b-a195-2f89ea5b50f8" (UID: "6dc28247-9b3f-421b-a195-2f89ea5b50f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.520827 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-inventory" (OuterVolumeSpecName: "inventory") pod "6dc28247-9b3f-421b-a195-2f89ea5b50f8" (UID: "6dc28247-9b3f-421b-a195-2f89ea5b50f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.581788 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.581962 4962 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.582024 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc28247-9b3f-421b-a195-2f89ea5b50f8-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.582077 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvx5f\" (UniqueName: \"kubernetes.io/projected/6dc28247-9b3f-421b-a195-2f89ea5b50f8-kube-api-access-tvx5f\") on node \"crc\" DevicePath \"\"" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.843240 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" event={"ID":"6dc28247-9b3f-421b-a195-2f89ea5b50f8","Type":"ContainerDied","Data":"8b0f477f4fe2dc79da7a842021777e81d75bcd0afb3afc2ad19b38bcd0d40929"} Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.843566 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b0f477f4fe2dc79da7a842021777e81d75bcd0afb3afc2ad19b38bcd0d40929" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.843329 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.968929 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z"] Dec 01 22:04:10 crc kubenswrapper[4962]: E1201 22:04:10.969814 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc28247-9b3f-421b-a195-2f89ea5b50f8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.969921 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc28247-9b3f-421b-a195-2f89ea5b50f8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 22:04:10 crc kubenswrapper[4962]: E1201 22:04:10.970084 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dca7aa-3ee9-4af1-85ba-e92ac88fd223" containerName="keystone-cron" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.970163 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dca7aa-3ee9-4af1-85ba-e92ac88fd223" containerName="keystone-cron" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.970532 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc28247-9b3f-421b-a195-2f89ea5b50f8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.970692 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dca7aa-3ee9-4af1-85ba-e92ac88fd223" containerName="keystone-cron" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.971776 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.974662 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.975003 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.975139 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:04:10 crc kubenswrapper[4962]: I1201 22:04:10.975363 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:10.998548 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z"] Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.093753 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqmzw\" (UniqueName: \"kubernetes.io/projected/de8a047d-3a82-4ffe-a734-76c25d8997e5-kube-api-access-kqmzw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.094122 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.094231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.195723 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.195789 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.195888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqmzw\" (UniqueName: \"kubernetes.io/projected/de8a047d-3a82-4ffe-a734-76c25d8997e5-kube-api-access-kqmzw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.200727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.202800 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.216961 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqmzw\" (UniqueName: \"kubernetes.io/projected/de8a047d-3a82-4ffe-a734-76c25d8997e5-kube-api-access-kqmzw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.292605 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.948688 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z"] Dec 01 22:04:11 crc kubenswrapper[4962]: W1201 22:04:11.968840 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde8a047d_3a82_4ffe_a734_76c25d8997e5.slice/crio-95147735f730e34d7db8bb7417d81f954c2fff97d9abb478ea56259c12a4532f WatchSource:0}: Error finding container 95147735f730e34d7db8bb7417d81f954c2fff97d9abb478ea56259c12a4532f: Status 404 returned error can't find the container with id 95147735f730e34d7db8bb7417d81f954c2fff97d9abb478ea56259c12a4532f Dec 01 22:04:11 crc kubenswrapper[4962]: I1201 22:04:11.973896 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 22:04:12 crc kubenswrapper[4962]: I1201 22:04:12.870298 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" event={"ID":"de8a047d-3a82-4ffe-a734-76c25d8997e5","Type":"ContainerStarted","Data":"95147735f730e34d7db8bb7417d81f954c2fff97d9abb478ea56259c12a4532f"} Dec 01 22:04:13 crc kubenswrapper[4962]: I1201 22:04:13.048221 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-28z4p"] Dec 01 22:04:13 crc kubenswrapper[4962]: I1201 22:04:13.074823 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-28z4p"] Dec 01 22:04:13 crc kubenswrapper[4962]: I1201 22:04:13.886239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" event={"ID":"de8a047d-3a82-4ffe-a734-76c25d8997e5","Type":"ContainerStarted","Data":"49d26daf26e371c581ebbb5d4422236dd37a8b9c4aa12aa911d19839be99afb4"} Dec 01 22:04:13 crc kubenswrapper[4962]: I1201 22:04:13.925800 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" podStartSLOduration=2.646366771 podStartE2EDuration="3.925763094s" podCreationTimestamp="2025-12-01 22:04:10 +0000 UTC" firstStartedPulling="2025-12-01 22:04:11.973666228 +0000 UTC m=+1836.075105423" lastFinishedPulling="2025-12-01 22:04:13.253062541 +0000 UTC m=+1837.354501746" observedRunningTime="2025-12-01 22:04:13.908489547 +0000 UTC m=+1838.009928782" watchObservedRunningTime="2025-12-01 22:04:13.925763094 +0000 UTC m=+1838.027202389" Dec 01 22:04:14 crc kubenswrapper[4962]: I1201 22:04:14.239651 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235fb826-ef71-488f-b902-efcf5dc9a7dd" path="/var/lib/kubelet/pods/235fb826-ef71-488f-b902-efcf5dc9a7dd/volumes" Dec 01 22:04:17 crc kubenswrapper[4962]: I1201 22:04:17.220186 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:04:17 crc kubenswrapper[4962]: E1201 22:04:17.221165 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" 
podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:04:24 crc kubenswrapper[4962]: I1201 22:04:24.109542 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qd22d"] Dec 01 22:04:24 crc kubenswrapper[4962]: I1201 22:04:24.129246 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qd22d"] Dec 01 22:04:24 crc kubenswrapper[4962]: I1201 22:04:24.234166 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6634bf9-94a9-4b1c-b14b-44b4ecc882bb" path="/var/lib/kubelet/pods/e6634bf9-94a9-4b1c-b14b-44b4ecc882bb/volumes" Dec 01 22:04:32 crc kubenswrapper[4962]: I1201 22:04:32.220308 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:04:32 crc kubenswrapper[4962]: E1201 22:04:32.221292 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:04:43 crc kubenswrapper[4962]: I1201 22:04:43.220307 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:04:43 crc kubenswrapper[4962]: E1201 22:04:43.221094 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:04:53 crc kubenswrapper[4962]: I1201 22:04:53.061163 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pvhvh"] Dec 01 22:04:53 crc kubenswrapper[4962]: I1201 22:04:53.075782 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pvhvh"] Dec 01 22:04:54 crc kubenswrapper[4962]: I1201 22:04:54.248565 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ecfd3cc-e01f-4a75-ad5e-8c0e44638525" path="/var/lib/kubelet/pods/4ecfd3cc-e01f-4a75-ad5e-8c0e44638525/volumes" Dec 01 22:04:55 crc kubenswrapper[4962]: I1201 22:04:55.221068 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:04:55 crc kubenswrapper[4962]: E1201 22:04:55.221609 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:05:04 crc kubenswrapper[4962]: I1201 22:05:04.629502 4962 scope.go:117] "RemoveContainer" containerID="93735b0229348dd1a45e61ee147ea636c8eb78d00c50726b4a269773fbd0fea9" Dec 01 22:05:04 crc kubenswrapper[4962]: I1201 22:05:04.687174 4962 scope.go:117] "RemoveContainer" containerID="b9bc37bd96b4ccb33a288be857b879b5744c6cdb5081ebf82d1f77c52393fb2d" Dec 
01 22:05:04 crc kubenswrapper[4962]: I1201 22:05:04.733549 4962 scope.go:117] "RemoveContainer" containerID="0cf2149813cb7bb627682a3022799e5c83455c122581776f0639344ccd6f3827" Dec 01 22:05:05 crc kubenswrapper[4962]: I1201 22:05:05.044383 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-762cx"] Dec 01 22:05:05 crc kubenswrapper[4962]: I1201 22:05:05.059394 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-762cx"] Dec 01 22:05:06 crc kubenswrapper[4962]: I1201 22:05:06.239473 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c13da36-7f69-4c0e-b830-8b71bdc181b2" path="/var/lib/kubelet/pods/7c13da36-7f69-4c0e-b830-8b71bdc181b2/volumes" Dec 01 22:05:07 crc kubenswrapper[4962]: I1201 22:05:07.220448 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:05:07 crc kubenswrapper[4962]: E1201 22:05:07.220777 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:05:11 crc kubenswrapper[4962]: I1201 22:05:11.047668 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-l44sw"] Dec 01 22:05:11 crc kubenswrapper[4962]: I1201 22:05:11.059790 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-l44sw"] Dec 01 22:05:12 crc kubenswrapper[4962]: I1201 22:05:12.238449 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c" path="/var/lib/kubelet/pods/eb73c9fd-2c3b-4d8e-a1c4-5ff4a2b8f67c/volumes" Dec 01 22:05:15 crc kubenswrapper[4962]: I1201 22:05:15.052393 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-q7knr"] Dec 01 22:05:15 crc kubenswrapper[4962]: I1201 22:05:15.064769 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-q7knr"] Dec 01 22:05:16 crc kubenswrapper[4962]: I1201 22:05:16.236533 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98cede6f-200e-44d9-a4ee-886de53f2459" path="/var/lib/kubelet/pods/98cede6f-200e-44d9-a4ee-886de53f2459/volumes" Dec 01 22:05:17 crc kubenswrapper[4962]: I1201 22:05:17.036158 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vzzq8"] Dec 01 22:05:17 crc kubenswrapper[4962]: I1201 22:05:17.046140 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vzzq8"] Dec 01 22:05:18 crc kubenswrapper[4962]: I1201 22:05:18.231805 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a10360-5fb7-4add-8bf5-1bc35e6e76dd" path="/var/lib/kubelet/pods/29a10360-5fb7-4add-8bf5-1bc35e6e76dd/volumes" Dec 01 22:05:19 crc kubenswrapper[4962]: I1201 22:05:19.220318 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:05:19 crc kubenswrapper[4962]: E1201 22:05:19.221099 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:05:33 crc kubenswrapper[4962]: I1201 22:05:33.219724 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:05:33 crc kubenswrapper[4962]: E1201 22:05:33.220583 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:05:45 crc kubenswrapper[4962]: I1201 22:05:45.219409 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:05:45 crc kubenswrapper[4962]: E1201 22:05:45.220354 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:05:58 crc kubenswrapper[4962]: I1201 22:05:58.220517 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:05:58 crc kubenswrapper[4962]: E1201 22:05:58.221310 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.065318 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-b85sm"] Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.076890 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-25db-account-create-update-h9hw7"] Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.094747 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4zm7d"] Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.107394 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-b85sm"] Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.121667 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-25db-account-create-update-h9hw7"] Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.131731 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4zm7d"] Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.234527 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88297b50-c65c-4dc3-8eed-d86b046b7f84" path="/var/lib/kubelet/pods/88297b50-c65c-4dc3-8eed-d86b046b7f84/volumes" Dec 01 22:06:02 crc 
kubenswrapper[4962]: I1201 22:06:02.235545 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdaa775e-7b2f-4c56-8f07-256a62b4ed20" path="/var/lib/kubelet/pods/bdaa775e-7b2f-4c56-8f07-256a62b4ed20/volumes" Dec 01 22:06:02 crc kubenswrapper[4962]: I1201 22:06:02.236255 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e2dc11-c865-45cb-ab81-c39e911fdef9" path="/var/lib/kubelet/pods/e2e2dc11-c865-45cb-ab81-c39e911fdef9/volumes" Dec 01 22:06:03 crc kubenswrapper[4962]: I1201 22:06:03.032127 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tpnk4"] Dec 01 22:06:03 crc kubenswrapper[4962]: I1201 22:06:03.046363 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tpnk4"] Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.053981 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-52d5-account-create-update-qxxxt"] Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.071242 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-52d5-account-create-update-qxxxt"] Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.084718 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-287a-account-create-update-svftm"] Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.095438 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-287a-account-create-update-svftm"] Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.242421 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f2a608-1cbf-4af5-ae73-9e3d141ae906" path="/var/lib/kubelet/pods/62f2a608-1cbf-4af5-ae73-9e3d141ae906/volumes" Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.243756 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd00f79-0c0f-4016-bd96-e0c497c73e36" path="/var/lib/kubelet/pods/6dd00f79-0c0f-4016-bd96-e0c497c73e36/volumes" Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.247047 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f3d631-bb76-48fd-9bb2-2326d9044956" path="/var/lib/kubelet/pods/a0f3d631-bb76-48fd-9bb2-2326d9044956/volumes" Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.889877 4962 scope.go:117] "RemoveContainer" containerID="6734d4d7d5d7a7e3a4b3fc1fcf5b534ca37454bc57c2d58023114345fa69751a" Dec 01 22:06:04 crc kubenswrapper[4962]: I1201 22:06:04.935124 4962 scope.go:117] "RemoveContainer" containerID="0933a1b3fae9725aafaf5fa8117f6c9c36071772289fc14389d1f89c99a0bf99" Dec 01 22:06:05 crc kubenswrapper[4962]: I1201 22:06:05.027131 4962 scope.go:117] "RemoveContainer" containerID="112394d5797b70c9941755c33740353ce3336dac30cfb421d5359cb72a83bbe6" Dec 01 22:06:05 crc kubenswrapper[4962]: I1201 22:06:05.078392 4962 scope.go:117] "RemoveContainer" containerID="4ee394e3c08b6b22f910c1ac746acc212144457ea20d4507fd9f0b626d62f0ea" Dec 01 22:06:05 crc kubenswrapper[4962]: I1201 22:06:05.162360 4962 scope.go:117] "RemoveContainer" containerID="d4e9f414a0b7139454956f76686a5451feb7b9d41abfad2effe2882278195e7b" Dec 01 22:06:05 crc kubenswrapper[4962]: I1201 22:06:05.207580 4962 scope.go:117] "RemoveContainer" containerID="312953e90d66804e0c3fe2c6b7f8cb61941ccdfa613a5441030df9579c545190" Dec 01 22:06:05 crc kubenswrapper[4962]: I1201 22:06:05.261350 4962 scope.go:117] "RemoveContainer" containerID="25f007fbb2c33ecfdc5a981921a1439c076bfb642912dbffdbffa0965553bcdd" Dec 01 22:06:05 crc 
kubenswrapper[4962]: I1201 22:06:05.290284 4962 scope.go:117] "RemoveContainer" containerID="45946b1574666843499789ecb531de96bd76a29871dbdec68756db81bc11fbc6" Dec 01 22:06:05 crc kubenswrapper[4962]: I1201 22:06:05.320926 4962 scope.go:117] "RemoveContainer" containerID="ffb379f93d52940c69349e2ca2c9ad09bf479e0b63c8db987b3041ef9b636637" Dec 01 22:06:05 crc kubenswrapper[4962]: I1201 22:06:05.350071 4962 scope.go:117] "RemoveContainer" containerID="03f499c20053f72585e31fd54b1d5759491a0ce63ac00973d6192f9e5c5683d4" Dec 01 22:06:10 crc kubenswrapper[4962]: I1201 22:06:10.219905 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:06:10 crc kubenswrapper[4962]: E1201 22:06:10.220800 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.094010 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m4mqr"] Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.097088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.105417 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m4mqr"] Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.260128 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-utilities\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.260463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6b9r\" (UniqueName: \"kubernetes.io/projected/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-kube-api-access-z6b9r\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.260542 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-catalog-content\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.363039 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-utilities\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.363284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6b9r\" (UniqueName: 
\"kubernetes.io/projected/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-kube-api-access-z6b9r\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.363324 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-catalog-content\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.364058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-utilities\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.364283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-catalog-content\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.388468 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6b9r\" (UniqueName: \"kubernetes.io/projected/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-kube-api-access-z6b9r\") pod \"community-operators-m4mqr\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:18 crc kubenswrapper[4962]: I1201 22:06:18.428893 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:19 crc kubenswrapper[4962]: I1201 22:06:19.016960 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m4mqr"] Dec 01 22:06:19 crc kubenswrapper[4962]: I1201 22:06:19.908225 4962 generic.go:334] "Generic (PLEG): container finished" podID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerID="b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa" exitCode=0 Dec 01 22:06:19 crc kubenswrapper[4962]: I1201 22:06:19.908323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4mqr" event={"ID":"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3","Type":"ContainerDied","Data":"b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa"} Dec 01 22:06:19 crc kubenswrapper[4962]: I1201 22:06:19.912030 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4mqr" event={"ID":"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3","Type":"ContainerStarted","Data":"66cda2f318a140a96a5a4ea46650214f1265cb1ce76aa6249e53f3de09f8231f"} Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.074746 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzdvf"] Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.078216 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.097094 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzdvf"] Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.132534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcwv5\" (UniqueName: \"kubernetes.io/projected/18cdf103-82c8-4796-9c6e-a544e6bce98e-kube-api-access-jcwv5\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.133017 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-utilities\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.133079 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-catalog-content\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.235033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcwv5\" (UniqueName: \"kubernetes.io/projected/18cdf103-82c8-4796-9c6e-a544e6bce98e-kube-api-access-jcwv5\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.235145 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-utilities\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.235185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-catalog-content\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.235733 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-utilities\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.235779 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-catalog-content\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.263465 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jcwv5\" (UniqueName: \"kubernetes.io/projected/18cdf103-82c8-4796-9c6e-a544e6bce98e-kube-api-access-jcwv5\") pod \"redhat-operators-bzdvf\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.397485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.894798 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzdvf"] Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.950743 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdvf" event={"ID":"18cdf103-82c8-4796-9c6e-a544e6bce98e","Type":"ContainerStarted","Data":"29a0421849021ebb46c607ff1334357515ddf0497d95ae1c264f9433507fdc5a"} Dec 01 22:06:21 crc kubenswrapper[4962]: I1201 22:06:21.952956 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4mqr" event={"ID":"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3","Type":"ContainerStarted","Data":"54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827"} Dec 01 22:06:22 crc kubenswrapper[4962]: I1201 22:06:22.962751 4962 generic.go:334] "Generic (PLEG): container finished" podID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerID="54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827" exitCode=0 Dec 01 22:06:22 crc kubenswrapper[4962]: I1201 22:06:22.962980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4mqr" event={"ID":"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3","Type":"ContainerDied","Data":"54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827"} Dec 01 22:06:22 crc kubenswrapper[4962]: I1201 22:06:22.964786 4962 generic.go:334] "Generic (PLEG): container finished" podID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerID="453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8" exitCode=0 Dec 01 22:06:22 crc kubenswrapper[4962]: I1201 22:06:22.964827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdvf" event={"ID":"18cdf103-82c8-4796-9c6e-a544e6bce98e","Type":"ContainerDied","Data":"453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8"} Dec 01 22:06:23 crc kubenswrapper[4962]: I1201 22:06:23.981620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4mqr" event={"ID":"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3","Type":"ContainerStarted","Data":"b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9"} Dec 01 22:06:24 crc kubenswrapper[4962]: I1201 22:06:24.004164 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m4mqr" podStartSLOduration=2.273169666 podStartE2EDuration="6.004144054s" podCreationTimestamp="2025-12-01 22:06:18 +0000 UTC" firstStartedPulling="2025-12-01 22:06:19.911217798 +0000 UTC m=+1964.012657003" lastFinishedPulling="2025-12-01 22:06:23.642192196 +0000 UTC m=+1967.743631391" observedRunningTime="2025-12-01 22:06:23.99938479 +0000 UTC m=+1968.100823995" watchObservedRunningTime="2025-12-01 22:06:24.004144054 +0000 UTC m=+1968.105583269" Dec 01 22:06:24 crc kubenswrapper[4962]: I1201 22:06:24.998882 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdvf" 
event={"ID":"18cdf103-82c8-4796-9c6e-a544e6bce98e","Type":"ContainerStarted","Data":"152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8"} Dec 01 22:06:25 crc kubenswrapper[4962]: I1201 22:06:25.220103 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:06:25 crc kubenswrapper[4962]: E1201 22:06:25.220961 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:06:28 crc kubenswrapper[4962]: I1201 22:06:28.430123 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:28 crc kubenswrapper[4962]: I1201 22:06:28.431010 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:29 crc kubenswrapper[4962]: I1201 22:06:29.503045 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m4mqr" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="registry-server" probeResult="failure" output=< Dec 01 22:06:29 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:06:29 crc kubenswrapper[4962]: > Dec 01 22:06:30 crc kubenswrapper[4962]: I1201 22:06:30.069788 4962 generic.go:334] "Generic (PLEG): container finished" podID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerID="152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8" exitCode=0 Dec 01 22:06:30 crc kubenswrapper[4962]: I1201 22:06:30.069878 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdvf" event={"ID":"18cdf103-82c8-4796-9c6e-a544e6bce98e","Type":"ContainerDied","Data":"152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8"} Dec 01 22:06:31 crc kubenswrapper[4962]: I1201 22:06:31.080886 4962 generic.go:334] "Generic (PLEG): container finished" podID="de8a047d-3a82-4ffe-a734-76c25d8997e5" containerID="49d26daf26e371c581ebbb5d4422236dd37a8b9c4aa12aa911d19839be99afb4" exitCode=0 Dec 01 22:06:31 crc kubenswrapper[4962]: I1201 22:06:31.080969 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" event={"ID":"de8a047d-3a82-4ffe-a734-76c25d8997e5","Type":"ContainerDied","Data":"49d26daf26e371c581ebbb5d4422236dd37a8b9c4aa12aa911d19839be99afb4"} Dec 01 22:06:31 crc kubenswrapper[4962]: I1201 22:06:31.083430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdvf" event={"ID":"18cdf103-82c8-4796-9c6e-a544e6bce98e","Type":"ContainerStarted","Data":"72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291"} Dec 01 22:06:31 crc kubenswrapper[4962]: I1201 22:06:31.130863 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzdvf" podStartSLOduration=2.311964742 podStartE2EDuration="10.130841904s" podCreationTimestamp="2025-12-01 22:06:21 +0000 UTC" firstStartedPulling="2025-12-01 22:06:22.966302283 +0000 UTC m=+1967.067741488" lastFinishedPulling="2025-12-01 22:06:30.785179455 
+0000 UTC m=+1974.886618650" observedRunningTime="2025-12-01 22:06:31.122790567 +0000 UTC m=+1975.224229792" watchObservedRunningTime="2025-12-01 22:06:31.130841904 +0000 UTC m=+1975.232281099" Dec 01 22:06:31 crc kubenswrapper[4962]: I1201 22:06:31.398008 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:31 crc kubenswrapper[4962]: I1201 22:06:31.398184 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.477153 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzdvf" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="registry-server" probeResult="failure" output=< Dec 01 22:06:32 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:06:32 crc kubenswrapper[4962]: > Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.691731 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.704685 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-inventory\") pod \"de8a047d-3a82-4ffe-a734-76c25d8997e5\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.704733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqmzw\" (UniqueName: \"kubernetes.io/projected/de8a047d-3a82-4ffe-a734-76c25d8997e5-kube-api-access-kqmzw\") pod \"de8a047d-3a82-4ffe-a734-76c25d8997e5\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.705106 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-ssh-key\") pod \"de8a047d-3a82-4ffe-a734-76c25d8997e5\" (UID: \"de8a047d-3a82-4ffe-a734-76c25d8997e5\") " Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.723212 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8a047d-3a82-4ffe-a734-76c25d8997e5-kube-api-access-kqmzw" (OuterVolumeSpecName: "kube-api-access-kqmzw") pod "de8a047d-3a82-4ffe-a734-76c25d8997e5" (UID: "de8a047d-3a82-4ffe-a734-76c25d8997e5"). InnerVolumeSpecName "kube-api-access-kqmzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.769620 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-inventory" (OuterVolumeSpecName: "inventory") pod "de8a047d-3a82-4ffe-a734-76c25d8997e5" (UID: "de8a047d-3a82-4ffe-a734-76c25d8997e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.773534 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "de8a047d-3a82-4ffe-a734-76c25d8997e5" (UID: "de8a047d-3a82-4ffe-a734-76c25d8997e5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.807324 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.807364 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqmzw\" (UniqueName: \"kubernetes.io/projected/de8a047d-3a82-4ffe-a734-76c25d8997e5-kube-api-access-kqmzw\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:32 crc kubenswrapper[4962]: I1201 22:06:32.807376 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8a047d-3a82-4ffe-a734-76c25d8997e5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.116975 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" event={"ID":"de8a047d-3a82-4ffe-a734-76c25d8997e5","Type":"ContainerDied","Data":"95147735f730e34d7db8bb7417d81f954c2fff97d9abb478ea56259c12a4532f"} Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.117022 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.117043 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95147735f730e34d7db8bb7417d81f954c2fff97d9abb478ea56259c12a4532f" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.209713 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946"] Dec 01 22:06:33 crc kubenswrapper[4962]: E1201 22:06:33.210458 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a047d-3a82-4ffe-a734-76c25d8997e5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.211389 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a047d-3a82-4ffe-a734-76c25d8997e5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.211816 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8a047d-3a82-4ffe-a734-76c25d8997e5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.213074 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.215843 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.216882 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.216882 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.216965 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.236806 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946"] Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.317837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.317920 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.318326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzlm9\" (UniqueName: \"kubernetes.io/projected/fae4b755-5a52-461b-939c-b870ddcc521b-kube-api-access-bzlm9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.421377 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzlm9\" (UniqueName: \"kubernetes.io/projected/fae4b755-5a52-461b-939c-b870ddcc521b-kube-api-access-bzlm9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.421698 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.421816 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.429464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.430902 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.445166 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzlm9\" (UniqueName: \"kubernetes.io/projected/fae4b755-5a52-461b-939c-b870ddcc521b-kube-api-access-bzlm9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wl946\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:33 crc kubenswrapper[4962]: I1201 22:06:33.537398 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:06:34 crc kubenswrapper[4962]: I1201 22:06:34.447306 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946"] Dec 01 22:06:35 crc kubenswrapper[4962]: I1201 22:06:35.051013 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4997b"] Dec 01 22:06:35 crc kubenswrapper[4962]: I1201 22:06:35.073378 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4997b"] Dec 01 22:06:35 crc kubenswrapper[4962]: I1201 22:06:35.135654 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" event={"ID":"fae4b755-5a52-461b-939c-b870ddcc521b","Type":"ContainerStarted","Data":"dd276b891ceedf7cea5f2520a356541c422ce9a714025cd6335d466ae0fc1b64"} Dec 01 22:06:36 crc kubenswrapper[4962]: I1201 22:06:36.153011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" event={"ID":"fae4b755-5a52-461b-939c-b870ddcc521b","Type":"ContainerStarted","Data":"076a75ea0487d6020f3172fb950560f0f6a54e9f00d3e7065905fd151fd80dfa"} Dec 01 22:06:36 crc kubenswrapper[4962]: I1201 22:06:36.180901 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" podStartSLOduration=2.257313257 podStartE2EDuration="3.180880274s" podCreationTimestamp="2025-12-01 22:06:33 +0000 UTC" firstStartedPulling="2025-12-01 22:06:34.442108735 +0000 UTC m=+1978.543547920" lastFinishedPulling="2025-12-01 22:06:35.365675732 +0000 UTC m=+1979.467114937" observedRunningTime="2025-12-01 22:06:36.168222827 +0000 UTC m=+1980.269662012" 
watchObservedRunningTime="2025-12-01 22:06:36.180880274 +0000 UTC m=+1980.282319479" Dec 01 22:06:36 crc kubenswrapper[4962]: I1201 22:06:36.238533 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93391e58-9905-4fec-a4fe-4ed30bbb5eec" path="/var/lib/kubelet/pods/93391e58-9905-4fec-a4fe-4ed30bbb5eec/volumes" Dec 01 22:06:38 crc kubenswrapper[4962]: I1201 22:06:38.220126 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f" Dec 01 22:06:38 crc kubenswrapper[4962]: I1201 22:06:38.533476 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:38 crc kubenswrapper[4962]: I1201 22:06:38.601173 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:38 crc kubenswrapper[4962]: I1201 22:06:38.792296 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m4mqr"] Dec 01 22:06:39 crc kubenswrapper[4962]: I1201 22:06:39.186975 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"f9afb91c220faf243aae75f9834e77e5c18873f4e7d46e25474ebf0923963082"} Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.198188 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m4mqr" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="registry-server" containerID="cri-o://b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9" gracePeriod=2 Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.718242 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.825869 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6b9r\" (UniqueName: \"kubernetes.io/projected/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-kube-api-access-z6b9r\") pod \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.825986 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-utilities\") pod \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.826112 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-catalog-content\") pod \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\" (UID: \"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3\") " Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.826843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-utilities" (OuterVolumeSpecName: "utilities") pod "d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" (UID: "d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.831336 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-kube-api-access-z6b9r" (OuterVolumeSpecName: "kube-api-access-z6b9r") pod "d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" (UID: "d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3"). InnerVolumeSpecName "kube-api-access-z6b9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.875668 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" (UID: "d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.928950 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6b9r\" (UniqueName: \"kubernetes.io/projected/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-kube-api-access-z6b9r\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.928989 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:40 crc kubenswrapper[4962]: I1201 22:06:40.928999 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.210760 4962 generic.go:334] "Generic (PLEG): container finished" podID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerID="b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9" exitCode=0 Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.211138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4mqr" event={"ID":"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3","Type":"ContainerDied","Data":"b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9"} Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.211171 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4mqr" event={"ID":"d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3","Type":"ContainerDied","Data":"66cda2f318a140a96a5a4ea46650214f1265cb1ce76aa6249e53f3de09f8231f"} Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.211191 4962 scope.go:117] "RemoveContainer" containerID="b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.211370 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m4mqr" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.246297 4962 scope.go:117] "RemoveContainer" containerID="54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.266557 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m4mqr"] Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.283555 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m4mqr"] Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.301043 4962 scope.go:117] "RemoveContainer" containerID="b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.333875 4962 scope.go:117] "RemoveContainer" containerID="b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9" Dec 01 22:06:41 crc kubenswrapper[4962]: E1201 22:06:41.335319 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9\": container with ID starting with b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9 not found: ID does not exist" containerID="b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.335363 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9"} err="failed to get container status \"b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9\": rpc error: code = NotFound desc = could not find container \"b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9\": container with ID starting with b7786fdd5b5450d40ba959565333636728881cd7ed0827941bd631efe80dffc9 not found: ID does not exist" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.335398 4962 scope.go:117] "RemoveContainer" containerID="54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827" Dec 01 22:06:41 crc kubenswrapper[4962]: E1201 22:06:41.336172 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827\": container with ID starting with 54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827 not found: ID does not exist" containerID="54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.336202 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827"} err="failed to get container status \"54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827\": rpc error: code = NotFound desc = could not find container \"54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827\": container with ID starting with 54994a2f8a562281ff230d564e00532aaff03d50bd9e4916f77995f819c36827 not found: ID does not exist" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.336217 4962 scope.go:117] "RemoveContainer" containerID="b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa" Dec 01 22:06:41 crc kubenswrapper[4962]: E1201 22:06:41.336424 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa\": container with ID starting with b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa not found: ID does not exist" containerID="b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa" Dec 01 22:06:41 crc kubenswrapper[4962]: I1201 22:06:41.336448 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa"} err="failed to get container status \"b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa\": rpc error: code = NotFound desc = could not find container \"b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa\": container with ID starting with b44d714730115e74b589a8cb1e917aa10f5e0080c8c3529d217e4faab10548fa not found: ID does not exist" Dec 01 22:06:42 crc kubenswrapper[4962]: I1201 22:06:42.233725 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" path="/var/lib/kubelet/pods/d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3/volumes" Dec 01 22:06:42 crc kubenswrapper[4962]: I1201 22:06:42.455435 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzdvf" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="registry-server" probeResult="failure" output=< Dec 01 22:06:42 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:06:42 crc kubenswrapper[4962]: > Dec 01 22:06:46 crc kubenswrapper[4962]: I1201 22:06:46.353675 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-65c954fcc-wpwn7" podUID="85abfbd6-374e-486e-93f1-8e8c4e8b5da0" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 01 22:06:51 crc kubenswrapper[4962]: I1201 22:06:51.495234 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:51 crc kubenswrapper[4962]: I1201 22:06:51.581495 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:52 crc kubenswrapper[4962]: I1201 22:06:52.279819 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzdvf"] Dec 01 22:06:53 crc kubenswrapper[4962]: I1201 22:06:53.069448 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-h9qhc"] Dec 01 22:06:53 crc kubenswrapper[4962]: I1201 22:06:53.083340 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-e930-account-create-update-gr4rp"] Dec 01 22:06:53 crc kubenswrapper[4962]: I1201 22:06:53.094962 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-e930-account-create-update-gr4rp"] Dec 01 22:06:53 crc kubenswrapper[4962]: I1201 22:06:53.103663 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-h9qhc"] Dec 01 22:06:53 crc kubenswrapper[4962]: I1201 22:06:53.397628 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzdvf" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="registry-server" containerID="cri-o://72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291" gracePeriod=2 Dec 01 22:06:53 crc kubenswrapper[4962]: E1201 22:06:53.577470 
4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cdf103_82c8_4796_9c6e_a544e6bce98e.slice/crio-conmon-72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cdf103_82c8_4796_9c6e_a544e6bce98e.slice/crio-72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291.scope\": RecentStats: unable to find data in memory cache]" Dec 01 22:06:53 crc kubenswrapper[4962]: I1201 22:06:53.962227 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.020528 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcwv5\" (UniqueName: \"kubernetes.io/projected/18cdf103-82c8-4796-9c6e-a544e6bce98e-kube-api-access-jcwv5\") pod \"18cdf103-82c8-4796-9c6e-a544e6bce98e\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.020626 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-catalog-content\") pod \"18cdf103-82c8-4796-9c6e-a544e6bce98e\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.020779 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-utilities\") pod \"18cdf103-82c8-4796-9c6e-a544e6bce98e\" (UID: \"18cdf103-82c8-4796-9c6e-a544e6bce98e\") " Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.022075 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-utilities" (OuterVolumeSpecName: "utilities") pod "18cdf103-82c8-4796-9c6e-a544e6bce98e" (UID: "18cdf103-82c8-4796-9c6e-a544e6bce98e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.022376 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.035668 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18cdf103-82c8-4796-9c6e-a544e6bce98e-kube-api-access-jcwv5" (OuterVolumeSpecName: "kube-api-access-jcwv5") pod "18cdf103-82c8-4796-9c6e-a544e6bce98e" (UID: "18cdf103-82c8-4796-9c6e-a544e6bce98e"). InnerVolumeSpecName "kube-api-access-jcwv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.124369 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcwv5\" (UniqueName: \"kubernetes.io/projected/18cdf103-82c8-4796-9c6e-a544e6bce98e-kube-api-access-jcwv5\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.144444 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18cdf103-82c8-4796-9c6e-a544e6bce98e" (UID: "18cdf103-82c8-4796-9c6e-a544e6bce98e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.226389 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cdf103-82c8-4796-9c6e-a544e6bce98e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.240684 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737cedfd-d3c6-4f5b-8289-af4b32ec094a" path="/var/lib/kubelet/pods/737cedfd-d3c6-4f5b-8289-af4b32ec094a/volumes" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.242862 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805240bc-5355-4ffa-886c-a4e96fb3a540" path="/var/lib/kubelet/pods/805240bc-5355-4ffa-886c-a4e96fb3a540/volumes" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.414456 4962 generic.go:334] "Generic (PLEG): container finished" podID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerID="72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291" exitCode=0 Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.414533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdvf" event={"ID":"18cdf103-82c8-4796-9c6e-a544e6bce98e","Type":"ContainerDied","Data":"72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291"} Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.414580 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzdvf" event={"ID":"18cdf103-82c8-4796-9c6e-a544e6bce98e","Type":"ContainerDied","Data":"29a0421849021ebb46c607ff1334357515ddf0497d95ae1c264f9433507fdc5a"} Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.414616 4962 scope.go:117] "RemoveContainer" containerID="72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.414901 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzdvf" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.453381 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzdvf"] Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.458911 4962 scope.go:117] "RemoveContainer" containerID="152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.464778 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzdvf"] Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.487424 4962 scope.go:117] "RemoveContainer" containerID="453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.558684 4962 scope.go:117] "RemoveContainer" containerID="72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291" Dec 01 22:06:54 crc kubenswrapper[4962]: E1201 22:06:54.562459 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291\": container with ID starting with 72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291 not found: ID does not exist" containerID="72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.562518 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291"} err="failed to get container status \"72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291\": rpc error: code = NotFound desc = could not find container \"72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291\": container with ID starting with 72af669461d2fae2ab8ce58091342df5556ccdea35b7fcfefad0350542234291 not found: ID does not exist" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.562556 4962 scope.go:117] "RemoveContainer" containerID="152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8" Dec 01 22:06:54 crc kubenswrapper[4962]: E1201 22:06:54.563013 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8\": container with ID starting with 152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8 not found: ID does not exist" containerID="152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.563041 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8"} err="failed to get container status \"152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8\": rpc error: code = NotFound desc = could not find container \"152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8\": container with ID starting with 152fb843aaa444d4e42fd1950444704c228575b294eacf3969da19c0d9dfd2c8 not found: ID does not exist" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.563059 4962 scope.go:117] "RemoveContainer" containerID="453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8" Dec 01 22:06:54 crc kubenswrapper[4962]: E1201 22:06:54.563277 4962 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8\": container with ID starting with 453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8 not found: ID does not exist" containerID="453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8" Dec 01 22:06:54 crc kubenswrapper[4962]: I1201 22:06:54.563316 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8"} err="failed to get container status \"453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8\": rpc error: code = NotFound desc = could not find container \"453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8\": container with ID starting with 453a8ec523fc6f0d226988c7bde187bed0f59ce81a2f8b898b6e525acabbe7d8 not found: ID does not exist" Dec 01 22:06:56 crc kubenswrapper[4962]: I1201 22:06:56.237787 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" path="/var/lib/kubelet/pods/18cdf103-82c8-4796-9c6e-a544e6bce98e/volumes" Dec 01 22:07:05 crc kubenswrapper[4962]: I1201 22:07:05.705643 4962 scope.go:117] "RemoveContainer" containerID="0c593c3dbdade458e130af55b4171bbf267694e32969ebf966f687723d011ab0" Dec 01 22:07:05 crc kubenswrapper[4962]: I1201 22:07:05.740119 4962 scope.go:117] "RemoveContainer" containerID="e34f0dac6167d0be003d127806068534bce4d39d4a26e764dcf94cfeaa2bd185" Dec 01 22:07:05 crc kubenswrapper[4962]: I1201 22:07:05.791590 4962 scope.go:117] "RemoveContainer" containerID="c77c0cb97248d978dc7a7e01e23b41b641701625149ceea908f513bdfe61d35f" Dec 01 22:07:34 crc kubenswrapper[4962]: I1201 22:07:34.051817 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvdgt"] Dec 01 22:07:34 crc kubenswrapper[4962]: I1201 22:07:34.063164 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvdgt"] Dec 01 22:07:34 crc kubenswrapper[4962]: I1201 22:07:34.241627 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e236312e-ed98-484b-ad3e-9ed5a6645df0" path="/var/lib/kubelet/pods/e236312e-ed98-484b-ad3e-9ed5a6645df0/volumes" Dec 01 22:07:35 crc kubenswrapper[4962]: I1201 22:07:35.037762 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-58lcl"] Dec 01 22:07:35 crc kubenswrapper[4962]: I1201 22:07:35.052187 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-58lcl"] Dec 01 22:07:36 crc kubenswrapper[4962]: I1201 22:07:36.233822 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d" path="/var/lib/kubelet/pods/f337e4ae-c46a-49f4-b8fe-bb7973cbdf8d/volumes" Dec 01 22:07:57 crc kubenswrapper[4962]: I1201 22:07:57.325096 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae4b755-5a52-461b-939c-b870ddcc521b" containerID="076a75ea0487d6020f3172fb950560f0f6a54e9f00d3e7065905fd151fd80dfa" exitCode=0 Dec 01 22:07:57 crc kubenswrapper[4962]: I1201 22:07:57.325226 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" event={"ID":"fae4b755-5a52-461b-939c-b870ddcc521b","Type":"ContainerDied","Data":"076a75ea0487d6020f3172fb950560f0f6a54e9f00d3e7065905fd151fd80dfa"} Dec 01 22:07:58 crc kubenswrapper[4962]: I1201 22:07:58.949695 
4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.076441 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-ssh-key\") pod \"fae4b755-5a52-461b-939c-b870ddcc521b\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.076524 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-inventory\") pod \"fae4b755-5a52-461b-939c-b870ddcc521b\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.076757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzlm9\" (UniqueName: \"kubernetes.io/projected/fae4b755-5a52-461b-939c-b870ddcc521b-kube-api-access-bzlm9\") pod \"fae4b755-5a52-461b-939c-b870ddcc521b\" (UID: \"fae4b755-5a52-461b-939c-b870ddcc521b\") " Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.082471 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae4b755-5a52-461b-939c-b870ddcc521b-kube-api-access-bzlm9" (OuterVolumeSpecName: "kube-api-access-bzlm9") pod "fae4b755-5a52-461b-939c-b870ddcc521b" (UID: "fae4b755-5a52-461b-939c-b870ddcc521b"). InnerVolumeSpecName "kube-api-access-bzlm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.117163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-inventory" (OuterVolumeSpecName: "inventory") pod "fae4b755-5a52-461b-939c-b870ddcc521b" (UID: "fae4b755-5a52-461b-939c-b870ddcc521b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.120525 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fae4b755-5a52-461b-939c-b870ddcc521b" (UID: "fae4b755-5a52-461b-939c-b870ddcc521b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.180525 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.180580 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae4b755-5a52-461b-939c-b870ddcc521b-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.180604 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzlm9\" (UniqueName: \"kubernetes.io/projected/fae4b755-5a52-461b-939c-b870ddcc521b-kube-api-access-bzlm9\") on node \"crc\" DevicePath \"\"" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.358611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" event={"ID":"fae4b755-5a52-461b-939c-b870ddcc521b","Type":"ContainerDied","Data":"dd276b891ceedf7cea5f2520a356541c422ce9a714025cd6335d466ae0fc1b64"} Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.358671 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd276b891ceedf7cea5f2520a356541c422ce9a714025cd6335d466ae0fc1b64" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.358691 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wl946" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.481868 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57"] Dec 01 22:07:59 crc kubenswrapper[4962]: E1201 22:07:59.482636 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="extract-content" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.482690 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="extract-content" Dec 01 22:07:59 crc kubenswrapper[4962]: E1201 22:07:59.482713 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="extract-utilities" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.482722 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="extract-utilities" Dec 01 22:07:59 crc kubenswrapper[4962]: E1201 22:07:59.482747 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="registry-server" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.482755 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="registry-server" Dec 01 22:07:59 crc kubenswrapper[4962]: E1201 22:07:59.482784 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="extract-content" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.482793 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="extract-content" Dec 01 22:07:59 crc kubenswrapper[4962]: E1201 22:07:59.482822 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fae4b755-5a52-461b-939c-b870ddcc521b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.482831 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae4b755-5a52-461b-939c-b870ddcc521b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 22:07:59 crc kubenswrapper[4962]: E1201 22:07:59.482851 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="registry-server" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.482859 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="registry-server" Dec 01 22:07:59 crc kubenswrapper[4962]: E1201 22:07:59.482898 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="extract-utilities" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.482908 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="extract-utilities" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.483256 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18cdf103-82c8-4796-9c6e-a544e6bce98e" containerName="registry-server" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.483293 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae4b755-5a52-461b-939c-b870ddcc521b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.483324 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ca4ee2-77c4-4862-9b18-7b3cf1598ac3" containerName="registry-server" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.485143 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.490288 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.490342 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.490582 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.501136 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.510685 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57"] Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.597365 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blp9p\" (UniqueName: \"kubernetes.io/projected/f2a3fbd2-3eb8-4784-8442-3299926b0172-kube-api-access-blp9p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.597555 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.597857 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.700378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.701187 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.701579 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blp9p\" (UniqueName: \"kubernetes.io/projected/f2a3fbd2-3eb8-4784-8442-3299926b0172-kube-api-access-blp9p\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.711716 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.712445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.729875 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blp9p\" (UniqueName: \"kubernetes.io/projected/f2a3fbd2-3eb8-4784-8442-3299926b0172-kube-api-access-blp9p\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tjg57\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:07:59 crc kubenswrapper[4962]: I1201 22:07:59.811566 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:08:00 crc kubenswrapper[4962]: I1201 22:08:00.532018 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57"] Dec 01 22:08:00 crc kubenswrapper[4962]: W1201 22:08:00.537118 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a3fbd2_3eb8_4784_8442_3299926b0172.slice/crio-0c247d33df996805ca0fc0943cdd758c002b6bdccc49adc4cc1165f23f9db314 WatchSource:0}: Error finding container 0c247d33df996805ca0fc0943cdd758c002b6bdccc49adc4cc1165f23f9db314: Status 404 returned error can't find the container with id 0c247d33df996805ca0fc0943cdd758c002b6bdccc49adc4cc1165f23f9db314 Dec 01 22:08:01 crc kubenswrapper[4962]: I1201 22:08:01.389404 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" event={"ID":"f2a3fbd2-3eb8-4784-8442-3299926b0172","Type":"ContainerStarted","Data":"1324ffd55769302e1614408716d1abb6cbb79f64c728957cfa24d2b0838d6ef7"} Dec 01 22:08:01 crc kubenswrapper[4962]: I1201 22:08:01.389873 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" event={"ID":"f2a3fbd2-3eb8-4784-8442-3299926b0172","Type":"ContainerStarted","Data":"0c247d33df996805ca0fc0943cdd758c002b6bdccc49adc4cc1165f23f9db314"} Dec 01 22:08:01 crc kubenswrapper[4962]: I1201 22:08:01.415670 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" podStartSLOduration=1.9264477960000002 podStartE2EDuration="2.415648764s" podCreationTimestamp="2025-12-01 22:07:59 +0000 UTC" firstStartedPulling="2025-12-01 22:08:00.542235529 +0000 UTC m=+2064.643674764" 
lastFinishedPulling="2025-12-01 22:08:01.031436517 +0000 UTC m=+2065.132875732" observedRunningTime="2025-12-01 22:08:01.40987197 +0000 UTC m=+2065.511311185" watchObservedRunningTime="2025-12-01 22:08:01.415648764 +0000 UTC m=+2065.517087979" Dec 01 22:08:06 crc kubenswrapper[4962]: I1201 22:08:06.007162 4962 scope.go:117] "RemoveContainer" containerID="0fbb87ea3559411331543374877d9cc86ad777f447b9042fed0047fb49676970" Dec 01 22:08:06 crc kubenswrapper[4962]: I1201 22:08:06.081674 4962 scope.go:117] "RemoveContainer" containerID="1d0ff72f8769c0fd22c9037b58fc1ff886d1665425352ace8895034a81a97248" Dec 01 22:08:07 crc kubenswrapper[4962]: I1201 22:08:07.482996 4962 generic.go:334] "Generic (PLEG): container finished" podID="f2a3fbd2-3eb8-4784-8442-3299926b0172" containerID="1324ffd55769302e1614408716d1abb6cbb79f64c728957cfa24d2b0838d6ef7" exitCode=0 Dec 01 22:08:07 crc kubenswrapper[4962]: I1201 22:08:07.483087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" event={"ID":"f2a3fbd2-3eb8-4784-8442-3299926b0172","Type":"ContainerDied","Data":"1324ffd55769302e1614408716d1abb6cbb79f64c728957cfa24d2b0838d6ef7"} Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.106175 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.180915 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blp9p\" (UniqueName: \"kubernetes.io/projected/f2a3fbd2-3eb8-4784-8442-3299926b0172-kube-api-access-blp9p\") pod \"f2a3fbd2-3eb8-4784-8442-3299926b0172\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.181177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-inventory\") pod \"f2a3fbd2-3eb8-4784-8442-3299926b0172\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.181355 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-ssh-key\") pod \"f2a3fbd2-3eb8-4784-8442-3299926b0172\" (UID: \"f2a3fbd2-3eb8-4784-8442-3299926b0172\") " Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.189682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a3fbd2-3eb8-4784-8442-3299926b0172-kube-api-access-blp9p" (OuterVolumeSpecName: "kube-api-access-blp9p") pod "f2a3fbd2-3eb8-4784-8442-3299926b0172" (UID: "f2a3fbd2-3eb8-4784-8442-3299926b0172"). InnerVolumeSpecName "kube-api-access-blp9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.216206 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2a3fbd2-3eb8-4784-8442-3299926b0172" (UID: "f2a3fbd2-3eb8-4784-8442-3299926b0172"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.230834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-inventory" (OuterVolumeSpecName: "inventory") pod "f2a3fbd2-3eb8-4784-8442-3299926b0172" (UID: "f2a3fbd2-3eb8-4784-8442-3299926b0172"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.285010 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blp9p\" (UniqueName: \"kubernetes.io/projected/f2a3fbd2-3eb8-4784-8442-3299926b0172-kube-api-access-blp9p\") on node \"crc\" DevicePath \"\"" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.285067 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.285080 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2a3fbd2-3eb8-4784-8442-3299926b0172-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.517812 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" event={"ID":"f2a3fbd2-3eb8-4784-8442-3299926b0172","Type":"ContainerDied","Data":"0c247d33df996805ca0fc0943cdd758c002b6bdccc49adc4cc1165f23f9db314"} Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.517909 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c247d33df996805ca0fc0943cdd758c002b6bdccc49adc4cc1165f23f9db314" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.518037 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tjg57" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.640452 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf"] Dec 01 22:08:09 crc kubenswrapper[4962]: E1201 22:08:09.641025 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a3fbd2-3eb8-4784-8442-3299926b0172" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.641042 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a3fbd2-3eb8-4784-8442-3299926b0172" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.641296 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a3fbd2-3eb8-4784-8442-3299926b0172" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.642106 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.645543 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.645739 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.645788 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.646153 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.674617 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf"] Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.693124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.693219 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.693294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gjj2\" (UniqueName: \"kubernetes.io/projected/d7f57b07-0c88-4569-9062-bbaaf50abefe-kube-api-access-9gjj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.796074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.796156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.796223 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gjj2\" (UniqueName: \"kubernetes.io/projected/d7f57b07-0c88-4569-9062-bbaaf50abefe-kube-api-access-9gjj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: 
\"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.800467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.802624 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.816924 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gjj2\" (UniqueName: \"kubernetes.io/projected/d7f57b07-0c88-4569-9062-bbaaf50abefe-kube-api-access-9gjj2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-68jbf\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:09 crc kubenswrapper[4962]: I1201 22:08:09.962969 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" Dec 01 22:08:10 crc kubenswrapper[4962]: I1201 22:08:10.545530 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf"] Dec 01 22:08:11 crc kubenswrapper[4962]: I1201 22:08:11.549916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" event={"ID":"d7f57b07-0c88-4569-9062-bbaaf50abefe","Type":"ContainerStarted","Data":"6e66fdc5f245ae2d4241e130ea12963d3c9c5715e6662ff2267458128563de73"} Dec 01 22:08:11 crc kubenswrapper[4962]: I1201 22:08:11.550404 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" event={"ID":"d7f57b07-0c88-4569-9062-bbaaf50abefe","Type":"ContainerStarted","Data":"5c3fe28bc3093956388ceaf222526e28b8003f1f899c84b9bc19cb4906bfaf7e"} Dec 01 22:08:11 crc kubenswrapper[4962]: I1201 22:08:11.579774 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" podStartSLOduration=2.080260536 podStartE2EDuration="2.57974547s" podCreationTimestamp="2025-12-01 22:08:09 +0000 UTC" firstStartedPulling="2025-12-01 22:08:10.557800119 +0000 UTC m=+2074.659239384" lastFinishedPulling="2025-12-01 22:08:11.057285093 +0000 UTC m=+2075.158724318" observedRunningTime="2025-12-01 22:08:11.57657591 +0000 UTC m=+2075.678015115" watchObservedRunningTime="2025-12-01 22:08:11.57974547 +0000 UTC m=+2075.681184705" Dec 01 22:08:19 crc kubenswrapper[4962]: I1201 22:08:19.051175 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-snnm6"] Dec 01 22:08:19 crc kubenswrapper[4962]: I1201 22:08:19.071057 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-snnm6"] Dec 01 22:08:20 crc kubenswrapper[4962]: I1201 22:08:20.237677 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod 
Dec 01 22:08:55 crc kubenswrapper[4962]: I1201 22:08:55.141974 4962 generic.go:334] "Generic (PLEG): container finished" podID="d7f57b07-0c88-4569-9062-bbaaf50abefe" containerID="6e66fdc5f245ae2d4241e130ea12963d3c9c5715e6662ff2267458128563de73" exitCode=0
Dec 01 22:08:55 crc kubenswrapper[4962]: I1201 22:08:55.142064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" event={"ID":"d7f57b07-0c88-4569-9062-bbaaf50abefe","Type":"ContainerDied","Data":"6e66fdc5f245ae2d4241e130ea12963d3c9c5715e6662ff2267458128563de73"}
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.795831 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf"
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.850816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gjj2\" (UniqueName: \"kubernetes.io/projected/d7f57b07-0c88-4569-9062-bbaaf50abefe-kube-api-access-9gjj2\") pod \"d7f57b07-0c88-4569-9062-bbaaf50abefe\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") "
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.850980 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-inventory\") pod \"d7f57b07-0c88-4569-9062-bbaaf50abefe\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") "
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.851130 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-ssh-key\") pod \"d7f57b07-0c88-4569-9062-bbaaf50abefe\" (UID: \"d7f57b07-0c88-4569-9062-bbaaf50abefe\") "
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.862511 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f57b07-0c88-4569-9062-bbaaf50abefe-kube-api-access-9gjj2" (OuterVolumeSpecName: "kube-api-access-9gjj2") pod "d7f57b07-0c88-4569-9062-bbaaf50abefe" (UID: "d7f57b07-0c88-4569-9062-bbaaf50abefe"). InnerVolumeSpecName "kube-api-access-9gjj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.884252 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-inventory" (OuterVolumeSpecName: "inventory") pod "d7f57b07-0c88-4569-9062-bbaaf50abefe" (UID: "d7f57b07-0c88-4569-9062-bbaaf50abefe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.912048 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7f57b07-0c88-4569-9062-bbaaf50abefe" (UID: "d7f57b07-0c88-4569-9062-bbaaf50abefe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.955338 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gjj2\" (UniqueName: \"kubernetes.io/projected/d7f57b07-0c88-4569-9062-bbaaf50abefe-kube-api-access-9gjj2\") on node \"crc\" DevicePath \"\""
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.955382 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 22:08:56 crc kubenswrapper[4962]: I1201 22:08:56.955396 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f57b07-0c88-4569-9062-bbaaf50abefe-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.169024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf" event={"ID":"d7f57b07-0c88-4569-9062-bbaaf50abefe","Type":"ContainerDied","Data":"5c3fe28bc3093956388ceaf222526e28b8003f1f899c84b9bc19cb4906bfaf7e"}
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.169529 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c3fe28bc3093956388ceaf222526e28b8003f1f899c84b9bc19cb4906bfaf7e"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.169107 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-68jbf"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.348813 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"]
Dec 01 22:08:57 crc kubenswrapper[4962]: E1201 22:08:57.349754 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f57b07-0c88-4569-9062-bbaaf50abefe" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.349893 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f57b07-0c88-4569-9062-bbaaf50abefe" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.350431 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f57b07-0c88-4569-9062-bbaaf50abefe" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.351844 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.354588 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.356320 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.356755 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.357784 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.371186 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"]
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.469133 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.469534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.469924 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjchx\" (UniqueName: \"kubernetes.io/projected/e004e0a7-fc4c-4236-a816-288147301262-kube-api-access-bjchx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.573356 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjchx\" (UniqueName: \"kubernetes.io/projected/e004e0a7-fc4c-4236-a816-288147301262-kube-api-access-bjchx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.573618 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.573679 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.599544 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.616440 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.619638 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjchx\" (UniqueName: \"kubernetes.io/projected/e004e0a7-fc4c-4236-a816-288147301262-kube-api-access-bjchx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:57 crc kubenswrapper[4962]: I1201 22:08:57.692440 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:08:58 crc kubenswrapper[4962]: I1201 22:08:58.256613 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"]
Dec 01 22:08:59 crc kubenswrapper[4962]: I1201 22:08:59.196777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f" event={"ID":"e004e0a7-fc4c-4236-a816-288147301262","Type":"ContainerStarted","Data":"0fae85168d65396330773e784d3ac3803bfcb59c6d7d90950a193852cb9847be"}
Dec 01 22:09:00 crc kubenswrapper[4962]: I1201 22:09:00.209609 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f" event={"ID":"e004e0a7-fc4c-4236-a816-288147301262","Type":"ContainerStarted","Data":"18d970ffcdad8417ec6b2884848226a63142102d63d495c658b3260fe06086de"}
Dec 01 22:09:00 crc kubenswrapper[4962]: I1201 22:09:00.241701 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f" podStartSLOduration=2.3533327059999998 podStartE2EDuration="3.241680276s" podCreationTimestamp="2025-12-01 22:08:57 +0000 UTC" firstStartedPulling="2025-12-01 22:08:58.252979666 +0000 UTC m=+2122.354418861" lastFinishedPulling="2025-12-01 22:08:59.141327226 +0000 UTC m=+2123.242766431" observedRunningTime="2025-12-01 22:09:00.239528695 +0000 UTC m=+2124.340967900" watchObservedRunningTime="2025-12-01 22:09:00.241680276 +0000 UTC m=+2124.343119481"
Dec 01 22:09:02 crc kubenswrapper[4962]: I1201 22:09:02.784081 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:09:02 crc kubenswrapper[4962]: I1201 22:09:02.784345 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:09:06 crc kubenswrapper[4962]: I1201 22:09:06.209286 4962 scope.go:117] "RemoveContainer" containerID="5349d1e018999cb3ae1b93470b4e63c23ad7c10a2906009fcd6516c6f176afd9"
Dec 01 22:09:32 crc kubenswrapper[4962]: I1201 22:09:32.784395 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:09:32 crc kubenswrapper[4962]: I1201 22:09:32.785274 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:09:41 crc kubenswrapper[4962]: I1201 22:09:41.053356 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-h98rk"]
Dec 01 22:09:41 crc kubenswrapper[4962]: I1201 22:09:41.063361 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-h98rk"]
Dec 01 22:09:42 crc kubenswrapper[4962]: I1201 22:09:42.243842 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0d848c-97d4-4360-a1e6-335cd2a8896c" path="/var/lib/kubelet/pods/ad0d848c-97d4-4360-a1e6-335cd2a8896c/volumes"
Dec 01 22:10:00 crc kubenswrapper[4962]: I1201 22:10:00.085622 4962 generic.go:334] "Generic (PLEG): container finished" podID="e004e0a7-fc4c-4236-a816-288147301262" containerID="18d970ffcdad8417ec6b2884848226a63142102d63d495c658b3260fe06086de" exitCode=0
Dec 01 22:10:00 crc kubenswrapper[4962]: I1201 22:10:00.085719 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f" event={"ID":"e004e0a7-fc4c-4236-a816-288147301262","Type":"ContainerDied","Data":"18d970ffcdad8417ec6b2884848226a63142102d63d495c658b3260fe06086de"}
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.613008 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.680763 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-inventory\") pod \"e004e0a7-fc4c-4236-a816-288147301262\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") "
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.680907 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjchx\" (UniqueName: \"kubernetes.io/projected/e004e0a7-fc4c-4236-a816-288147301262-kube-api-access-bjchx\") pod \"e004e0a7-fc4c-4236-a816-288147301262\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") "
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.681063 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-ssh-key\") pod \"e004e0a7-fc4c-4236-a816-288147301262\" (UID: \"e004e0a7-fc4c-4236-a816-288147301262\") "
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.688235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e004e0a7-fc4c-4236-a816-288147301262-kube-api-access-bjchx" (OuterVolumeSpecName: "kube-api-access-bjchx") pod "e004e0a7-fc4c-4236-a816-288147301262" (UID: "e004e0a7-fc4c-4236-a816-288147301262"). InnerVolumeSpecName "kube-api-access-bjchx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.720040 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-inventory" (OuterVolumeSpecName: "inventory") pod "e004e0a7-fc4c-4236-a816-288147301262" (UID: "e004e0a7-fc4c-4236-a816-288147301262"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.723360 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e004e0a7-fc4c-4236-a816-288147301262" (UID: "e004e0a7-fc4c-4236-a816-288147301262"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.784612 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjchx\" (UniqueName: \"kubernetes.io/projected/e004e0a7-fc4c-4236-a816-288147301262-kube-api-access-bjchx\") on node \"crc\" DevicePath \"\""
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.784655 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 22:10:01 crc kubenswrapper[4962]: I1201 22:10:01.784669 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e004e0a7-fc4c-4236-a816-288147301262-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.110823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f" event={"ID":"e004e0a7-fc4c-4236-a816-288147301262","Type":"ContainerDied","Data":"0fae85168d65396330773e784d3ac3803bfcb59c6d7d90950a193852cb9847be"}
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.111164 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fae85168d65396330773e784d3ac3803bfcb59c6d7d90950a193852cb9847be"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.110926 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.208762 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-stgzx"]
Dec 01 22:10:02 crc kubenswrapper[4962]: E1201 22:10:02.209427 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e004e0a7-fc4c-4236-a816-288147301262" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.209451 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e004e0a7-fc4c-4236-a816-288147301262" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.209807 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e004e0a7-fc4c-4236-a816-288147301262" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.210919 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
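The "SyncLoop ADD/UPDATE/DELETE" entries with source="api" show the kubelet reacting to pod watch events from the API server, and the reflector.go "Caches populated" lines are its per-namespace Secret/ConfigMap informers warming up before volume setup. For orientation, a minimal client-go sketch (ours, not kubelet code) that watches the same openstack namespace and prints the corresponding events; it assumes a reachable cluster and a standard kubeconfig:

    package main

    import (
    	"context"
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Standard kubeconfig loading (~/.kube/config); in-cluster config works too.
    	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
    	if err != nil {
    		panic(err)
    	}
    	client := kubernetes.NewForConfigOrDie(cfg)

    	// Watch pods in the namespace the entries above refer to.
    	w, err := client.CoreV1().Pods("openstack").Watch(context.Background(), metav1.ListOptions{})
    	if err != nil {
    		panic(err)
    	}
    	defer w.Stop()

    	// ADDED/MODIFIED/DELETED here correspond to the kubelet's
    	// "SyncLoop ADD/UPDATE/DELETE" view of the same API stream.
    	for ev := range w.ResultChan() {
    		if pod, ok := ev.Object.(*corev1.Pod); ok {
    			fmt.Println(ev.Type, pod.Namespace+"/"+pod.Name)
    		}
    	}
    }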
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.213368 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.213857 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.214040 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.214042 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.297290 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-stgzx"]
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.397117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.397243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.397268 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb6wq\" (UniqueName: \"kubernetes.io/projected/4356462c-43b0-40db-824b-f9abb87cb9dd-kube-api-access-jb6wq\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.499276 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.499354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb6wq\" (UniqueName: \"kubernetes.io/projected/4356462c-43b0-40db-824b-f9abb87cb9dd-kube-api-access-jb6wq\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.499611 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.504970 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.505058 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.518853 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb6wq\" (UniqueName: \"kubernetes.io/projected/4356462c-43b0-40db-824b-f9abb87cb9dd-kube-api-access-jb6wq\") pod \"ssh-known-hosts-edpm-deployment-stgzx\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.611486 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.784347 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.784736 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.784782 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.785658 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9afb91c220faf243aae75f9834e77e5c18873f4e7d46e25474ebf0923963082"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 22:10:02 crc kubenswrapper[4962]: I1201 22:10:02.785714 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://f9afb91c220faf243aae75f9834e77e5c18873f4e7d46e25474ebf0923963082" gracePeriod=600
Dec 01 22:10:03 crc kubenswrapper[4962]: I1201 22:10:03.129016 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="f9afb91c220faf243aae75f9834e77e5c18873f4e7d46e25474ebf0923963082" exitCode=0
Dec 01 22:10:03 crc kubenswrapper[4962]: I1201 22:10:03.129082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"f9afb91c220faf243aae75f9834e77e5c18873f4e7d46e25474ebf0923963082"}
Dec 01 22:10:03 crc kubenswrapper[4962]: I1201 22:10:03.129409 4962 scope.go:117] "RemoveContainer" containerID="ff25160e91e4465a0010cbcc4456883387e71df20e0a000eb4c6b6750118371f"
Dec 01 22:10:03 crc kubenswrapper[4962]: I1201 22:10:03.193868 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-stgzx"]
Dec 01 22:10:03 crc kubenswrapper[4962]: W1201 22:10:03.194667 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4356462c_43b0_40db_824b_f9abb87cb9dd.slice/crio-216e60d1e5f5648b7116c92b37d2a5793ecd85962c0a74704c19fffbccacced5 WatchSource:0}: Error finding container 216e60d1e5f5648b7116c92b37d2a5793ecd85962c0a74704c19fffbccacced5: Status 404 returned error can't find the container with id 216e60d1e5f5648b7116c92b37d2a5793ecd85962c0a74704c19fffbccacced5
Dec 01 22:10:03 crc kubenswrapper[4962]: I1201 22:10:03.197279 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 22:10:04 crc kubenswrapper[4962]: I1201 22:10:04.140039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx" event={"ID":"4356462c-43b0-40db-824b-f9abb87cb9dd","Type":"ContainerStarted","Data":"216e60d1e5f5648b7116c92b37d2a5793ecd85962c0a74704c19fffbccacced5"}
Dec 01 22:10:04 crc kubenswrapper[4962]: I1201 22:10:04.144343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"}
Dec 01 22:10:05 crc kubenswrapper[4962]: I1201 22:10:05.156698 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx" event={"ID":"4356462c-43b0-40db-824b-f9abb87cb9dd","Type":"ContainerStarted","Data":"262298f269dd2fb8322f5e1f0bb0ff2f9eafb8279f0821be3104584ed51e793f"}
Dec 01 22:10:05 crc kubenswrapper[4962]: I1201 22:10:05.175827 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx" podStartSLOduration=2.40837649 podStartE2EDuration="3.175812473s" podCreationTimestamp="2025-12-01 22:10:02 +0000 UTC" firstStartedPulling="2025-12-01 22:10:03.196961131 +0000 UTC m=+2187.298400326" lastFinishedPulling="2025-12-01 22:10:03.964397114 +0000 UTC m=+2188.065836309" observedRunningTime="2025-12-01 22:10:05.169176214 +0000 UTC m=+2189.270615409" watchObservedRunningTime="2025-12-01 22:10:05.175812473 +0000 UTC m=+2189.277251668"
Dec 01 22:10:06 crc kubenswrapper[4962]: I1201 22:10:06.302967 4962 scope.go:117] "RemoveContainer" containerID="fc8aac9eea6d702113a0e121c057dc321eeeb7340b6c56d21962f43c2f8b119a"
Dec 01 22:10:12 crc kubenswrapper[4962]: I1201 22:10:12.232003 4962 generic.go:334] "Generic (PLEG): container finished" podID="4356462c-43b0-40db-824b-f9abb87cb9dd" containerID="262298f269dd2fb8322f5e1f0bb0ff2f9eafb8279f0821be3104584ed51e793f" exitCode=0
Dec 01 22:10:12 crc kubenswrapper[4962]: I1201 22:10:12.237650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx" event={"ID":"4356462c-43b0-40db-824b-f9abb87cb9dd","Type":"ContainerDied","Data":"262298f269dd2fb8322f5e1f0bb0ff2f9eafb8279f0821be3104584ed51e793f"}
event={"ID":"4356462c-43b0-40db-824b-f9abb87cb9dd","Type":"ContainerDied","Data":"262298f269dd2fb8322f5e1f0bb0ff2f9eafb8279f0821be3104584ed51e793f"} Dec 01 22:10:13 crc kubenswrapper[4962]: I1201 22:10:13.809873 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx" Dec 01 22:10:13 crc kubenswrapper[4962]: I1201 22:10:13.915587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb6wq\" (UniqueName: \"kubernetes.io/projected/4356462c-43b0-40db-824b-f9abb87cb9dd-kube-api-access-jb6wq\") pod \"4356462c-43b0-40db-824b-f9abb87cb9dd\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " Dec 01 22:10:13 crc kubenswrapper[4962]: I1201 22:10:13.915659 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-ssh-key-openstack-edpm-ipam\") pod \"4356462c-43b0-40db-824b-f9abb87cb9dd\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " Dec 01 22:10:13 crc kubenswrapper[4962]: I1201 22:10:13.915729 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-inventory-0\") pod \"4356462c-43b0-40db-824b-f9abb87cb9dd\" (UID: \"4356462c-43b0-40db-824b-f9abb87cb9dd\") " Dec 01 22:10:13 crc kubenswrapper[4962]: I1201 22:10:13.937241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4356462c-43b0-40db-824b-f9abb87cb9dd-kube-api-access-jb6wq" (OuterVolumeSpecName: "kube-api-access-jb6wq") pod "4356462c-43b0-40db-824b-f9abb87cb9dd" (UID: "4356462c-43b0-40db-824b-f9abb87cb9dd"). InnerVolumeSpecName "kube-api-access-jb6wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:10:13 crc kubenswrapper[4962]: I1201 22:10:13.955426 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4356462c-43b0-40db-824b-f9abb87cb9dd" (UID: "4356462c-43b0-40db-824b-f9abb87cb9dd"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:10:13 crc kubenswrapper[4962]: I1201 22:10:13.958173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4356462c-43b0-40db-824b-f9abb87cb9dd" (UID: "4356462c-43b0-40db-824b-f9abb87cb9dd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.018806 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb6wq\" (UniqueName: \"kubernetes.io/projected/4356462c-43b0-40db-824b-f9abb87cb9dd-kube-api-access-jb6wq\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.018852 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.018872 4962 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4356462c-43b0-40db-824b-f9abb87cb9dd-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.267977 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx" event={"ID":"4356462c-43b0-40db-824b-f9abb87cb9dd","Type":"ContainerDied","Data":"216e60d1e5f5648b7116c92b37d2a5793ecd85962c0a74704c19fffbccacced5"} Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.268025 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="216e60d1e5f5648b7116c92b37d2a5793ecd85962c0a74704c19fffbccacced5" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.268062 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-stgzx" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.358927 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj"] Dec 01 22:10:14 crc kubenswrapper[4962]: E1201 22:10:14.359853 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4356462c-43b0-40db-824b-f9abb87cb9dd" containerName="ssh-known-hosts-edpm-deployment" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.359927 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4356462c-43b0-40db-824b-f9abb87cb9dd" containerName="ssh-known-hosts-edpm-deployment" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.360261 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4356462c-43b0-40db-824b-f9abb87cb9dd" containerName="ssh-known-hosts-edpm-deployment" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.361117 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.370702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.371213 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.372019 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.387681 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.390391 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj"] Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.535288 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.535349 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.535488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcs2m\" (UniqueName: \"kubernetes.io/projected/1e8e98c7-cc77-45f5-be56-cb73df6427e4-kube-api-access-tcs2m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.637696 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcs2m\" (UniqueName: \"kubernetes.io/projected/1e8e98c7-cc77-45f5-be56-cb73df6427e4-kube-api-access-tcs2m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.637803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.637840 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.641887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.642288 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.653120 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcs2m\" (UniqueName: \"kubernetes.io/projected/1e8e98c7-cc77-45f5-be56-cb73df6427e4-kube-api-access-tcs2m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mwgwj\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:14 crc kubenswrapper[4962]: I1201 22:10:14.705368 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:15 crc kubenswrapper[4962]: I1201 22:10:15.362296 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj"] Dec 01 22:10:16 crc kubenswrapper[4962]: I1201 22:10:16.303952 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" event={"ID":"1e8e98c7-cc77-45f5-be56-cb73df6427e4","Type":"ContainerStarted","Data":"b998c9ea0f58ee40ed4a1d31829179fd3a8c70952c06e54f4fcea5e658817a52"} Dec 01 22:10:16 crc kubenswrapper[4962]: I1201 22:10:16.304320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" event={"ID":"1e8e98c7-cc77-45f5-be56-cb73df6427e4","Type":"ContainerStarted","Data":"f448462a8553647aa33ae6be887c96d65a42a1b5b57a83fcfad1650a6f4aec84"} Dec 01 22:10:16 crc kubenswrapper[4962]: I1201 22:10:16.329153 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" podStartSLOduration=1.773474928 podStartE2EDuration="2.329128069s" podCreationTimestamp="2025-12-01 22:10:14 +0000 UTC" firstStartedPulling="2025-12-01 22:10:15.346426264 +0000 UTC m=+2199.447865499" lastFinishedPulling="2025-12-01 22:10:15.902079445 +0000 UTC m=+2200.003518640" observedRunningTime="2025-12-01 22:10:16.325280939 +0000 UTC m=+2200.426720174" watchObservedRunningTime="2025-12-01 22:10:16.329128069 +0000 UTC m=+2200.430567284" Dec 01 22:10:19 crc kubenswrapper[4962]: I1201 22:10:19.064477 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-fxbhs"] Dec 01 22:10:19 crc kubenswrapper[4962]: I1201 22:10:19.080309 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-fxbhs"] Dec 01 22:10:20 crc kubenswrapper[4962]: I1201 22:10:20.237654 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="182ffb78-4709-491d-a294-c0c924cf4d5d" 
path="/var/lib/kubelet/pods/182ffb78-4709-491d-a294-c0c924cf4d5d/volumes" Dec 01 22:10:20 crc kubenswrapper[4962]: I1201 22:10:20.729630 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cm8tm"] Dec 01 22:10:20 crc kubenswrapper[4962]: I1201 22:10:20.732690 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:20 crc kubenswrapper[4962]: I1201 22:10:20.746451 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cm8tm"] Dec 01 22:10:20 crc kubenswrapper[4962]: I1201 22:10:20.899019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22t89\" (UniqueName: \"kubernetes.io/projected/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-kube-api-access-22t89\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:20 crc kubenswrapper[4962]: I1201 22:10:20.899233 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-utilities\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:20 crc kubenswrapper[4962]: I1201 22:10:20.899405 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-catalog-content\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.001917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22t89\" (UniqueName: \"kubernetes.io/projected/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-kube-api-access-22t89\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.002057 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-utilities\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.002640 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-utilities\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.003106 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-catalog-content\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.002746 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-catalog-content\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.042187 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22t89\" (UniqueName: \"kubernetes.io/projected/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-kube-api-access-22t89\") pod \"certified-operators-cm8tm\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.059291 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:21 crc kubenswrapper[4962]: I1201 22:10:21.598081 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cm8tm"] Dec 01 22:10:22 crc kubenswrapper[4962]: I1201 22:10:22.386361 4962 generic.go:334] "Generic (PLEG): container finished" podID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerID="41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9" exitCode=0 Dec 01 22:10:22 crc kubenswrapper[4962]: I1201 22:10:22.386499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm8tm" event={"ID":"2b8bbee6-8af7-45c9-8a59-7b7220fce26b","Type":"ContainerDied","Data":"41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9"} Dec 01 22:10:22 crc kubenswrapper[4962]: I1201 22:10:22.387000 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm8tm" event={"ID":"2b8bbee6-8af7-45c9-8a59-7b7220fce26b","Type":"ContainerStarted","Data":"3572496c5a0d5c6dd0259d8861ef7529f490eb955a50b8c19a3060781d88a3ba"} Dec 01 22:10:23 crc kubenswrapper[4962]: I1201 22:10:23.403124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm8tm" event={"ID":"2b8bbee6-8af7-45c9-8a59-7b7220fce26b","Type":"ContainerStarted","Data":"0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695"} Dec 01 22:10:25 crc kubenswrapper[4962]: I1201 22:10:25.434750 4962 generic.go:334] "Generic (PLEG): container finished" podID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerID="0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695" exitCode=0 Dec 01 22:10:25 crc kubenswrapper[4962]: I1201 22:10:25.437891 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm8tm" event={"ID":"2b8bbee6-8af7-45c9-8a59-7b7220fce26b","Type":"ContainerDied","Data":"0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695"} Dec 01 22:10:26 crc kubenswrapper[4962]: I1201 22:10:26.458149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm8tm" event={"ID":"2b8bbee6-8af7-45c9-8a59-7b7220fce26b","Type":"ContainerStarted","Data":"736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c"} Dec 01 22:10:26 crc kubenswrapper[4962]: I1201 22:10:26.463349 4962 generic.go:334] "Generic (PLEG): container finished" podID="1e8e98c7-cc77-45f5-be56-cb73df6427e4" containerID="b998c9ea0f58ee40ed4a1d31829179fd3a8c70952c06e54f4fcea5e658817a52" exitCode=0 Dec 01 22:10:26 crc kubenswrapper[4962]: I1201 22:10:26.463393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" event={"ID":"1e8e98c7-cc77-45f5-be56-cb73df6427e4","Type":"ContainerDied","Data":"b998c9ea0f58ee40ed4a1d31829179fd3a8c70952c06e54f4fcea5e658817a52"} Dec 01 22:10:26 crc kubenswrapper[4962]: I1201 22:10:26.502253 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cm8tm" podStartSLOduration=2.81703954 podStartE2EDuration="6.502229213s" podCreationTimestamp="2025-12-01 22:10:20 +0000 UTC" firstStartedPulling="2025-12-01 22:10:22.389656326 +0000 UTC m=+2206.491095571" lastFinishedPulling="2025-12-01 22:10:26.074846019 +0000 UTC m=+2210.176285244" observedRunningTime="2025-12-01 22:10:26.485915469 +0000 UTC m=+2210.587354674" watchObservedRunningTime="2025-12-01 22:10:26.502229213 +0000 UTC m=+2210.603668418" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.050999 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.218950 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-inventory\") pod \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.219240 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcs2m\" (UniqueName: \"kubernetes.io/projected/1e8e98c7-cc77-45f5-be56-cb73df6427e4-kube-api-access-tcs2m\") pod \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.219370 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-ssh-key\") pod \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\" (UID: \"1e8e98c7-cc77-45f5-be56-cb73df6427e4\") " Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.224709 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8e98c7-cc77-45f5-be56-cb73df6427e4-kube-api-access-tcs2m" (OuterVolumeSpecName: "kube-api-access-tcs2m") pod "1e8e98c7-cc77-45f5-be56-cb73df6427e4" (UID: "1e8e98c7-cc77-45f5-be56-cb73df6427e4"). InnerVolumeSpecName "kube-api-access-tcs2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.250895 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1e8e98c7-cc77-45f5-be56-cb73df6427e4" (UID: "1e8e98c7-cc77-45f5-be56-cb73df6427e4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.259653 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-inventory" (OuterVolumeSpecName: "inventory") pod "1e8e98c7-cc77-45f5-be56-cb73df6427e4" (UID: "1e8e98c7-cc77-45f5-be56-cb73df6427e4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.323519 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.323972 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e8e98c7-cc77-45f5-be56-cb73df6427e4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.323996 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcs2m\" (UniqueName: \"kubernetes.io/projected/1e8e98c7-cc77-45f5-be56-cb73df6427e4-kube-api-access-tcs2m\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.497417 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" event={"ID":"1e8e98c7-cc77-45f5-be56-cb73df6427e4","Type":"ContainerDied","Data":"f448462a8553647aa33ae6be887c96d65a42a1b5b57a83fcfad1650a6f4aec84"} Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.497505 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f448462a8553647aa33ae6be887c96d65a42a1b5b57a83fcfad1650a6f4aec84" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.497600 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mwgwj" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.633696 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"] Dec 01 22:10:28 crc kubenswrapper[4962]: E1201 22:10:28.634574 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8e98c7-cc77-45f5-be56-cb73df6427e4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.634608 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8e98c7-cc77-45f5-be56-cb73df6427e4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.635234 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8e98c7-cc77-45f5-be56-cb73df6427e4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.636747 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.673642 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.674433 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.675306 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.677355 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.691855 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"]
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.734908 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.735333 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74zfd\" (UniqueName: \"kubernetes.io/projected/aee89e9b-f93a-4100-bc51-0a701a9d9549-kube-api-access-74zfd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.735430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.837394 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74zfd\" (UniqueName: \"kubernetes.io/projected/aee89e9b-f93a-4100-bc51-0a701a9d9549-kube-api-access-74zfd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.837455 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.837582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.843087 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.844754 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.857575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74zfd\" (UniqueName: \"kubernetes.io/projected/aee89e9b-f93a-4100-bc51-0a701a9d9549-kube-api-access-74zfd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:28 crc kubenswrapper[4962]: I1201 22:10:28.993757 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"
Dec 01 22:10:29 crc kubenswrapper[4962]: I1201 22:10:29.429806 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2"]
Dec 01 22:10:29 crc kubenswrapper[4962]: W1201 22:10:29.432529 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee89e9b_f93a_4100_bc51_0a701a9d9549.slice/crio-4ca22c1594b440c6a69368b3924fefeedd118f0bc03d0b031a02ddba64201a49 WatchSource:0}: Error finding container 4ca22c1594b440c6a69368b3924fefeedd118f0bc03d0b031a02ddba64201a49: Status 404 returned error can't find the container with id 4ca22c1594b440c6a69368b3924fefeedd118f0bc03d0b031a02ddba64201a49
Dec 01 22:10:29 crc kubenswrapper[4962]: I1201 22:10:29.509972 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2" event={"ID":"aee89e9b-f93a-4100-bc51-0a701a9d9549","Type":"ContainerStarted","Data":"4ca22c1594b440c6a69368b3924fefeedd118f0bc03d0b031a02ddba64201a49"}
Dec 01 22:10:30 crc kubenswrapper[4962]: I1201 22:10:30.526825 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2" event={"ID":"aee89e9b-f93a-4100-bc51-0a701a9d9549","Type":"ContainerStarted","Data":"5013fc3f9fdffddbbd1366602fab786c417da181e172911babf82a345c803d13"}
Dec 01 22:10:30 crc kubenswrapper[4962]: I1201 22:10:30.562221 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2" podStartSLOduration=1.9745628929999999 podStartE2EDuration="2.562195553s" podCreationTimestamp="2025-12-01 22:10:28 +0000 UTC" firstStartedPulling="2025-12-01 22:10:29.434758322 +0000 UTC m=+2213.536197517" lastFinishedPulling="2025-12-01 22:10:30.022390972 +0000 UTC m=+2214.123830177" observedRunningTime="2025-12-01 22:10:30.550003066 +0000 UTC m=+2214.651442271" watchObservedRunningTime="2025-12-01 22:10:30.562195553 +0000 UTC m=+2214.663634758"
Dec 01 22:10:31 crc kubenswrapper[4962]: I1201 22:10:31.059904 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cm8tm"
Dec 01 22:10:31 crc kubenswrapper[4962]: I1201 22:10:31.060017 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cm8tm"
Dec 01 22:10:31 crc kubenswrapper[4962]: I1201 22:10:31.141415 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cm8tm"
Dec 01 22:10:31 crc kubenswrapper[4962]: I1201 22:10:31.622235 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cm8tm"
Dec 01 22:10:31 crc kubenswrapper[4962]: I1201 22:10:31.688826 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cm8tm"]
Dec 01 22:10:33 crc kubenswrapper[4962]: I1201 22:10:33.578900 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cm8tm" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="registry-server" containerID="cri-o://736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c" gracePeriod=2
Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.564236 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cm8tm"
Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.621075 4962 generic.go:334] "Generic (PLEG): container finished" podID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerID="736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c" exitCode=0
Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.621132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm8tm" event={"ID":"2b8bbee6-8af7-45c9-8a59-7b7220fce26b","Type":"ContainerDied","Data":"736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c"}
Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.621171 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm8tm" event={"ID":"2b8bbee6-8af7-45c9-8a59-7b7220fce26b","Type":"ContainerDied","Data":"3572496c5a0d5c6dd0259d8861ef7529f490eb955a50b8c19a3060781d88a3ba"}
Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.621201 4962 scope.go:117] "RemoveContainer" containerID="736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c"
Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.621205 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cm8tm" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.649963 4962 scope.go:117] "RemoveContainer" containerID="0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.651996 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-utilities\") pod \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.652666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22t89\" (UniqueName: \"kubernetes.io/projected/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-kube-api-access-22t89\") pod \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.652953 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-catalog-content\") pod \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\" (UID: \"2b8bbee6-8af7-45c9-8a59-7b7220fce26b\") " Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.653515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-utilities" (OuterVolumeSpecName: "utilities") pod "2b8bbee6-8af7-45c9-8a59-7b7220fce26b" (UID: "2b8bbee6-8af7-45c9-8a59-7b7220fce26b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.654082 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.660577 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-kube-api-access-22t89" (OuterVolumeSpecName: "kube-api-access-22t89") pod "2b8bbee6-8af7-45c9-8a59-7b7220fce26b" (UID: "2b8bbee6-8af7-45c9-8a59-7b7220fce26b"). InnerVolumeSpecName "kube-api-access-22t89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.714550 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b8bbee6-8af7-45c9-8a59-7b7220fce26b" (UID: "2b8bbee6-8af7-45c9-8a59-7b7220fce26b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.741073 4962 scope.go:117] "RemoveContainer" containerID="41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.756267 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22t89\" (UniqueName: \"kubernetes.io/projected/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-kube-api-access-22t89\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.756305 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b8bbee6-8af7-45c9-8a59-7b7220fce26b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.773001 4962 scope.go:117] "RemoveContainer" containerID="736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c" Dec 01 22:10:34 crc kubenswrapper[4962]: E1201 22:10:34.773425 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c\": container with ID starting with 736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c not found: ID does not exist" containerID="736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.773468 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c"} err="failed to get container status \"736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c\": rpc error: code = NotFound desc = could not find container \"736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c\": container with ID starting with 736c4c82760ad47eb1ea2451294159969ad798e200e4042ee79d21b6cfa6f83c not found: ID does not exist" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.773498 4962 scope.go:117] "RemoveContainer" containerID="0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695" Dec 01 22:10:34 crc kubenswrapper[4962]: E1201 22:10:34.773912 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695\": container with ID starting with 0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695 not found: ID does not exist" containerID="0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.774018 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695"} err="failed to get container status \"0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695\": rpc error: code = NotFound desc = could not find container \"0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695\": container with ID starting with 0b62a8dbfbc6cbf1155b1c0a863169c22c569be2bf47f00becbef59862013695 not found: ID does not exist" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.774076 4962 scope.go:117] "RemoveContainer" containerID="41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9" Dec 01 22:10:34 crc kubenswrapper[4962]: E1201 22:10:34.774401 4962 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9\": container with ID starting with 41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9 not found: ID does not exist" containerID="41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.774432 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9"} err="failed to get container status \"41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9\": rpc error: code = NotFound desc = could not find container \"41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9\": container with ID starting with 41669553c7c0c067b438bd86d4195907266b1030f435955a5242f27e333a55b9 not found: ID does not exist" Dec 01 22:10:34 crc kubenswrapper[4962]: I1201 22:10:34.985817 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cm8tm"] Dec 01 22:10:35 crc kubenswrapper[4962]: I1201 22:10:35.001837 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cm8tm"] Dec 01 22:10:36 crc kubenswrapper[4962]: I1201 22:10:36.244983 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" path="/var/lib/kubelet/pods/2b8bbee6-8af7-45c9-8a59-7b7220fce26b/volumes" Dec 01 22:10:41 crc kubenswrapper[4962]: I1201 22:10:41.728636 4962 generic.go:334] "Generic (PLEG): container finished" podID="aee89e9b-f93a-4100-bc51-0a701a9d9549" containerID="5013fc3f9fdffddbbd1366602fab786c417da181e172911babf82a345c803d13" exitCode=0 Dec 01 22:10:41 crc kubenswrapper[4962]: I1201 22:10:41.729474 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2" event={"ID":"aee89e9b-f93a-4100-bc51-0a701a9d9549","Type":"ContainerDied","Data":"5013fc3f9fdffddbbd1366602fab786c417da181e172911babf82a345c803d13"} Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.341222 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.494771 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74zfd\" (UniqueName: \"kubernetes.io/projected/aee89e9b-f93a-4100-bc51-0a701a9d9549-kube-api-access-74zfd\") pod \"aee89e9b-f93a-4100-bc51-0a701a9d9549\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.494822 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-ssh-key\") pod \"aee89e9b-f93a-4100-bc51-0a701a9d9549\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.494862 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-inventory\") pod \"aee89e9b-f93a-4100-bc51-0a701a9d9549\" (UID: \"aee89e9b-f93a-4100-bc51-0a701a9d9549\") " Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.508157 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee89e9b-f93a-4100-bc51-0a701a9d9549-kube-api-access-74zfd" (OuterVolumeSpecName: "kube-api-access-74zfd") pod "aee89e9b-f93a-4100-bc51-0a701a9d9549" (UID: "aee89e9b-f93a-4100-bc51-0a701a9d9549"). InnerVolumeSpecName "kube-api-access-74zfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.540146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aee89e9b-f93a-4100-bc51-0a701a9d9549" (UID: "aee89e9b-f93a-4100-bc51-0a701a9d9549"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.543825 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-inventory" (OuterVolumeSpecName: "inventory") pod "aee89e9b-f93a-4100-bc51-0a701a9d9549" (UID: "aee89e9b-f93a-4100-bc51-0a701a9d9549"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.597893 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74zfd\" (UniqueName: \"kubernetes.io/projected/aee89e9b-f93a-4100-bc51-0a701a9d9549-kube-api-access-74zfd\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.597925 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.597938 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee89e9b-f93a-4100-bc51-0a701a9d9549-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.760282 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2" event={"ID":"aee89e9b-f93a-4100-bc51-0a701a9d9549","Type":"ContainerDied","Data":"4ca22c1594b440c6a69368b3924fefeedd118f0bc03d0b031a02ddba64201a49"} Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.760537 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca22c1594b440c6a69368b3924fefeedd118f0bc03d0b031a02ddba64201a49" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.760572 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.879984 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz"] Dec 01 22:10:43 crc kubenswrapper[4962]: E1201 22:10:43.880491 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="registry-server" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.880508 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="registry-server" Dec 01 22:10:43 crc kubenswrapper[4962]: E1201 22:10:43.880564 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="extract-utilities" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.880571 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="extract-utilities" Dec 01 22:10:43 crc kubenswrapper[4962]: E1201 22:10:43.880583 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee89e9b-f93a-4100-bc51-0a701a9d9549" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.880590 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee89e9b-f93a-4100-bc51-0a701a9d9549" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 22:10:43 crc kubenswrapper[4962]: E1201 22:10:43.880606 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="extract-content" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.880611 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="extract-content" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.880871 4962 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="aee89e9b-f93a-4100-bc51-0a701a9d9549" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.880900 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8bbee6-8af7-45c9-8a59-7b7220fce26b" containerName="registry-server" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.882127 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.885350 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.885656 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.886046 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.886269 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.886429 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.886686 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.886857 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.888318 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.888463 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:10:43 crc kubenswrapper[4962]: I1201 22:10:43.892866 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz"] Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.007842 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.008121 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.008349 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.008423 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.008455 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.008521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.008600 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.009581 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.009684 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.009747 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.009860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.010034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.010097 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.010183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.010228 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l48sg\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-kube-api-access-l48sg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.010293 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112571 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112630 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112649 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112678 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112757 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112775 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l48sg\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-kube-api-access-l48sg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.112882 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.120531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.121784 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.123146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.123660 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.124477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.124863 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.125545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.126313 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.126325 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.124610 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.126640 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.128643 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.132152 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.136375 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.149763 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.151036 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l48sg\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-kube-api-access-l48sg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-544xz\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.223022 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:10:44 crc kubenswrapper[4962]: I1201 22:10:44.795901 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz"] Dec 01 22:10:45 crc kubenswrapper[4962]: I1201 22:10:45.798049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" event={"ID":"3dd32d50-cae8-4762-ba07-ce8d8d1996b8","Type":"ContainerStarted","Data":"45aaf9eae00818a8fe1c84d24c328837c0b4028ff91d2f6abae4a853428b17e0"} Dec 01 22:10:46 crc kubenswrapper[4962]: I1201 22:10:46.810020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" event={"ID":"3dd32d50-cae8-4762-ba07-ce8d8d1996b8","Type":"ContainerStarted","Data":"ca45f55ef375b96615955376f2ce439fe355ed14655a08983c2a337c2fabc6b4"} Dec 01 22:10:46 crc kubenswrapper[4962]: I1201 22:10:46.847565 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" podStartSLOduration=2.993465358 podStartE2EDuration="3.847544385s" podCreationTimestamp="2025-12-01 22:10:43 +0000 UTC" firstStartedPulling="2025-12-01 22:10:44.804402236 +0000 UTC m=+2228.905841441" lastFinishedPulling="2025-12-01 22:10:45.658481263 +0000 UTC m=+2229.759920468" observedRunningTime="2025-12-01 22:10:46.834684259 +0000 UTC m=+2230.936123464" watchObservedRunningTime="2025-12-01 22:10:46.847544385 +0000 UTC m=+2230.948983590" Dec 01 22:11:06 crc kubenswrapper[4962]: I1201 22:11:06.389679 4962 scope.go:117] "RemoveContainer" containerID="293d8ed9818712602706120aa1c390e9d45a79be67adab1be930418272593f07" Dec 01 22:11:42 crc kubenswrapper[4962]: I1201 22:11:42.569777 4962 generic.go:334] "Generic (PLEG): container finished" podID="3dd32d50-cae8-4762-ba07-ce8d8d1996b8" containerID="ca45f55ef375b96615955376f2ce439fe355ed14655a08983c2a337c2fabc6b4" exitCode=0 Dec 01 22:11:42 crc kubenswrapper[4962]: I1201 22:11:42.569915 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" event={"ID":"3dd32d50-cae8-4762-ba07-ce8d8d1996b8","Type":"ContainerDied","Data":"ca45f55ef375b96615955376f2ce439fe355ed14655a08983c2a337c2fabc6b4"} Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.186004 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373104 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-nova-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373216 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ssh-key\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-bootstrap-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373284 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373597 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373650 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l48sg\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-kube-api-access-l48sg\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373695 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-inventory\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: 
\"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373787 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373824 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.373955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-repo-setup-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.374156 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-libvirt-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.374214 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.374250 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ovn-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.374276 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-neutron-metadata-combined-ca-bundle\") pod \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\" (UID: \"3dd32d50-cae8-4762-ba07-ce8d8d1996b8\") " Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.384427 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.386259 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.387173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.387370 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.389614 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.390135 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.390873 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.390961 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-kube-api-access-l48sg" (OuterVolumeSpecName: "kube-api-access-l48sg") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). 
InnerVolumeSpecName "kube-api-access-l48sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.392816 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.395536 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.404019 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.404818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.405213 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.407915 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.438285 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-inventory" (OuterVolumeSpecName: "inventory") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.465362 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3dd32d50-cae8-4762-ba07-ce8d8d1996b8" (UID: "3dd32d50-cae8-4762-ba07-ce8d8d1996b8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477088 4962 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477121 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477132 4962 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477143 4962 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477152 4962 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477161 4962 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477173 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477197 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477207 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477221 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l48sg\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-kube-api-access-l48sg\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 
22:11:44.477230 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477238 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477247 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477256 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477265 4962 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.477273 4962 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd32d50-cae8-4762-ba07-ce8d8d1996b8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.602094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" event={"ID":"3dd32d50-cae8-4762-ba07-ce8d8d1996b8","Type":"ContainerDied","Data":"45aaf9eae00818a8fe1c84d24c328837c0b4028ff91d2f6abae4a853428b17e0"} Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.602157 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45aaf9eae00818a8fe1c84d24c328837c0b4028ff91d2f6abae4a853428b17e0" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.602179 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-544xz" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.865184 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq"] Dec 01 22:11:44 crc kubenswrapper[4962]: E1201 22:11:44.866460 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd32d50-cae8-4762-ba07-ce8d8d1996b8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.866503 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd32d50-cae8-4762-ba07-ce8d8d1996b8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.866976 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd32d50-cae8-4762-ba07-ce8d8d1996b8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.868389 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.871223 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.871565 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.872499 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.872866 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.873137 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.885024 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq"] Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.908894 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.909231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.909558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.909629 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6gsk\" (UniqueName: \"kubernetes.io/projected/4ea2d579-b57c-41a6-a255-ee852e50ec7a-kube-api-access-f6gsk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:44 crc kubenswrapper[4962]: I1201 22:11:44.909753 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.011681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.011749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6gsk\" (UniqueName: \"kubernetes.io/projected/4ea2d579-b57c-41a6-a255-ee852e50ec7a-kube-api-access-f6gsk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.011822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.011977 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.012155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.013400 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.016847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.017961 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.018298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.032250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6gsk\" (UniqueName: \"kubernetes.io/projected/4ea2d579-b57c-41a6-a255-ee852e50ec7a-kube-api-access-f6gsk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hwvmq\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.249232 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:11:45 crc kubenswrapper[4962]: I1201 22:11:45.856826 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq"] Dec 01 22:11:46 crc kubenswrapper[4962]: I1201 22:11:46.632923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" event={"ID":"4ea2d579-b57c-41a6-a255-ee852e50ec7a","Type":"ContainerStarted","Data":"bfc893d6675e9a73d88dbc1c574de7278891b32765a513b762e3078f941e294c"} Dec 01 22:11:47 crc kubenswrapper[4962]: I1201 22:11:47.644680 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" event={"ID":"4ea2d579-b57c-41a6-a255-ee852e50ec7a","Type":"ContainerStarted","Data":"630e1204ce489026c6f849c2a7a94d6aaf9a83763dec25c6e1164d07e59e2893"} Dec 01 22:11:47 crc kubenswrapper[4962]: I1201 22:11:47.678983 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" podStartSLOduration=2.9705218909999997 podStartE2EDuration="3.678964977s" podCreationTimestamp="2025-12-01 22:11:44 +0000 UTC" firstStartedPulling="2025-12-01 22:11:45.860385544 +0000 UTC m=+2289.961824739" lastFinishedPulling="2025-12-01 22:11:46.56882859 +0000 UTC m=+2290.670267825" observedRunningTime="2025-12-01 22:11:47.669417576 +0000 UTC m=+2291.770856781" watchObservedRunningTime="2025-12-01 22:11:47.678964977 +0000 UTC m=+2291.780404172" Dec 01 22:12:30 crc kubenswrapper[4962]: I1201 22:12:30.977185 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bsnqr"] Dec 01 22:12:30 crc kubenswrapper[4962]: I1201 22:12:30.980585 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.004642 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsnqr"] Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.009798 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-utilities\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.009963 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-catalog-content\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.010058 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6vg2\" (UniqueName: \"kubernetes.io/projected/8796560f-c576-43fc-8d05-af44154632ab-kube-api-access-n6vg2\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.112122 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-utilities\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.112203 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-catalog-content\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.112231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6vg2\" (UniqueName: \"kubernetes.io/projected/8796560f-c576-43fc-8d05-af44154632ab-kube-api-access-n6vg2\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.112642 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-utilities\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.112720 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-catalog-content\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.131645 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n6vg2\" (UniqueName: \"kubernetes.io/projected/8796560f-c576-43fc-8d05-af44154632ab-kube-api-access-n6vg2\") pod \"redhat-marketplace-bsnqr\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.313510 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:31 crc kubenswrapper[4962]: I1201 22:12:31.872242 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsnqr"] Dec 01 22:12:32 crc kubenswrapper[4962]: I1201 22:12:32.344808 4962 generic.go:334] "Generic (PLEG): container finished" podID="8796560f-c576-43fc-8d05-af44154632ab" containerID="c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851" exitCode=0 Dec 01 22:12:32 crc kubenswrapper[4962]: I1201 22:12:32.344855 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsnqr" event={"ID":"8796560f-c576-43fc-8d05-af44154632ab","Type":"ContainerDied","Data":"c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851"} Dec 01 22:12:32 crc kubenswrapper[4962]: I1201 22:12:32.345159 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsnqr" event={"ID":"8796560f-c576-43fc-8d05-af44154632ab","Type":"ContainerStarted","Data":"207168d386924fe249c5f6402b99bd3112686f86f31adbe3b46205d083f364f1"} Dec 01 22:12:32 crc kubenswrapper[4962]: I1201 22:12:32.784757 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:12:32 crc kubenswrapper[4962]: I1201 22:12:32.784867 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:12:34 crc kubenswrapper[4962]: I1201 22:12:34.388819 4962 generic.go:334] "Generic (PLEG): container finished" podID="8796560f-c576-43fc-8d05-af44154632ab" containerID="05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee" exitCode=0 Dec 01 22:12:34 crc kubenswrapper[4962]: I1201 22:12:34.388915 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsnqr" event={"ID":"8796560f-c576-43fc-8d05-af44154632ab","Type":"ContainerDied","Data":"05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee"} Dec 01 22:12:36 crc kubenswrapper[4962]: I1201 22:12:36.416863 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsnqr" event={"ID":"8796560f-c576-43fc-8d05-af44154632ab","Type":"ContainerStarted","Data":"e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91"} Dec 01 22:12:36 crc kubenswrapper[4962]: I1201 22:12:36.445585 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bsnqr" podStartSLOduration=3.511827897 podStartE2EDuration="6.445562582s" podCreationTimestamp="2025-12-01 22:12:30 +0000 UTC" firstStartedPulling="2025-12-01 22:12:32.348154377 
+0000 UTC m=+2336.449593562" lastFinishedPulling="2025-12-01 22:12:35.281889022 +0000 UTC m=+2339.383328247" observedRunningTime="2025-12-01 22:12:36.43741147 +0000 UTC m=+2340.538850665" watchObservedRunningTime="2025-12-01 22:12:36.445562582 +0000 UTC m=+2340.547001787" Dec 01 22:12:41 crc kubenswrapper[4962]: I1201 22:12:41.313774 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:41 crc kubenswrapper[4962]: I1201 22:12:41.314692 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:41 crc kubenswrapper[4962]: I1201 22:12:41.404306 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:41 crc kubenswrapper[4962]: I1201 22:12:41.531816 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:41 crc kubenswrapper[4962]: I1201 22:12:41.648630 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsnqr"] Dec 01 22:12:43 crc kubenswrapper[4962]: I1201 22:12:43.504015 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bsnqr" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="registry-server" containerID="cri-o://e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91" gracePeriod=2 Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.102189 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.213558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-utilities\") pod \"8796560f-c576-43fc-8d05-af44154632ab\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.213790 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-catalog-content\") pod \"8796560f-c576-43fc-8d05-af44154632ab\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.213961 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6vg2\" (UniqueName: \"kubernetes.io/projected/8796560f-c576-43fc-8d05-af44154632ab-kube-api-access-n6vg2\") pod \"8796560f-c576-43fc-8d05-af44154632ab\" (UID: \"8796560f-c576-43fc-8d05-af44154632ab\") " Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.215927 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-utilities" (OuterVolumeSpecName: "utilities") pod "8796560f-c576-43fc-8d05-af44154632ab" (UID: "8796560f-c576-43fc-8d05-af44154632ab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.216640 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.225611 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8796560f-c576-43fc-8d05-af44154632ab-kube-api-access-n6vg2" (OuterVolumeSpecName: "kube-api-access-n6vg2") pod "8796560f-c576-43fc-8d05-af44154632ab" (UID: "8796560f-c576-43fc-8d05-af44154632ab"). InnerVolumeSpecName "kube-api-access-n6vg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.251763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8796560f-c576-43fc-8d05-af44154632ab" (UID: "8796560f-c576-43fc-8d05-af44154632ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.320558 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8796560f-c576-43fc-8d05-af44154632ab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.320666 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6vg2\" (UniqueName: \"kubernetes.io/projected/8796560f-c576-43fc-8d05-af44154632ab-kube-api-access-n6vg2\") on node \"crc\" DevicePath \"\"" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.520743 4962 generic.go:334] "Generic (PLEG): container finished" podID="8796560f-c576-43fc-8d05-af44154632ab" containerID="e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91" exitCode=0 Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.520792 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsnqr" event={"ID":"8796560f-c576-43fc-8d05-af44154632ab","Type":"ContainerDied","Data":"e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91"} Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.520827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsnqr" event={"ID":"8796560f-c576-43fc-8d05-af44154632ab","Type":"ContainerDied","Data":"207168d386924fe249c5f6402b99bd3112686f86f31adbe3b46205d083f364f1"} Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.520852 4962 scope.go:117] "RemoveContainer" containerID="e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.520846 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsnqr" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.565831 4962 scope.go:117] "RemoveContainer" containerID="05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.584745 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsnqr"] Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.606453 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsnqr"] Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.621405 4962 scope.go:117] "RemoveContainer" containerID="c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.681925 4962 scope.go:117] "RemoveContainer" containerID="e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91" Dec 01 22:12:44 crc kubenswrapper[4962]: E1201 22:12:44.682899 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91\": container with ID starting with e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91 not found: ID does not exist" containerID="e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.682974 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91"} err="failed to get container status \"e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91\": rpc error: code = NotFound desc = could not find container \"e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91\": container with ID starting with e27d8ab26377b657fc90cbfe3ef263ef35fa95330ee70dd0f1f327f987420a91 not found: ID does not exist" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.683014 4962 scope.go:117] "RemoveContainer" containerID="05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee" Dec 01 22:12:44 crc kubenswrapper[4962]: E1201 22:12:44.683616 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee\": container with ID starting with 05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee not found: ID does not exist" containerID="05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.683669 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee"} err="failed to get container status \"05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee\": rpc error: code = NotFound desc = could not find container \"05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee\": container with ID starting with 05890f2342fbe4afd6dc32b8328e53123817d655a46153d85b5b38fe3d5ca2ee not found: ID does not exist" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.683711 4962 scope.go:117] "RemoveContainer" containerID="c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851" Dec 01 22:12:44 crc kubenswrapper[4962]: E1201 22:12:44.684334 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851\": container with ID starting with c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851 not found: ID does not exist" containerID="c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851" Dec 01 22:12:44 crc kubenswrapper[4962]: I1201 22:12:44.684405 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851"} err="failed to get container status \"c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851\": rpc error: code = NotFound desc = could not find container \"c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851\": container with ID starting with c958bdcd6a2e61d509acf7d4cd2bda12b2c0f47054d4f8c9b61eb72ef9b7f851 not found: ID does not exist" Dec 01 22:12:46 crc kubenswrapper[4962]: I1201 22:12:46.241339 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8796560f-c576-43fc-8d05-af44154632ab" path="/var/lib/kubelet/pods/8796560f-c576-43fc-8d05-af44154632ab/volumes" Dec 01 22:13:01 crc kubenswrapper[4962]: I1201 22:13:01.792147 4962 generic.go:334] "Generic (PLEG): container finished" podID="4ea2d579-b57c-41a6-a255-ee852e50ec7a" containerID="630e1204ce489026c6f849c2a7a94d6aaf9a83763dec25c6e1164d07e59e2893" exitCode=0 Dec 01 22:13:01 crc kubenswrapper[4962]: I1201 22:13:01.792239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" event={"ID":"4ea2d579-b57c-41a6-a255-ee852e50ec7a","Type":"ContainerDied","Data":"630e1204ce489026c6f849c2a7a94d6aaf9a83763dec25c6e1164d07e59e2893"} Dec 01 22:13:02 crc kubenswrapper[4962]: I1201 22:13:02.784218 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:13:02 crc kubenswrapper[4962]: I1201 22:13:02.784480 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.341594 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.513835 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovn-combined-ca-bundle\") pod \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.513882 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6gsk\" (UniqueName: \"kubernetes.io/projected/4ea2d579-b57c-41a6-a255-ee852e50ec7a-kube-api-access-f6gsk\") pod \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.513911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-inventory\") pod \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.514141 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovncontroller-config-0\") pod \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.514356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ssh-key\") pod \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\" (UID: \"4ea2d579-b57c-41a6-a255-ee852e50ec7a\") " Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.520296 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea2d579-b57c-41a6-a255-ee852e50ec7a-kube-api-access-f6gsk" (OuterVolumeSpecName: "kube-api-access-f6gsk") pod "4ea2d579-b57c-41a6-a255-ee852e50ec7a" (UID: "4ea2d579-b57c-41a6-a255-ee852e50ec7a"). InnerVolumeSpecName "kube-api-access-f6gsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.524488 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4ea2d579-b57c-41a6-a255-ee852e50ec7a" (UID: "4ea2d579-b57c-41a6-a255-ee852e50ec7a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.551839 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4ea2d579-b57c-41a6-a255-ee852e50ec7a" (UID: "4ea2d579-b57c-41a6-a255-ee852e50ec7a"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.554139 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4ea2d579-b57c-41a6-a255-ee852e50ec7a" (UID: "4ea2d579-b57c-41a6-a255-ee852e50ec7a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.562649 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-inventory" (OuterVolumeSpecName: "inventory") pod "4ea2d579-b57c-41a6-a255-ee852e50ec7a" (UID: "4ea2d579-b57c-41a6-a255-ee852e50ec7a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.617681 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.617712 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6gsk\" (UniqueName: \"kubernetes.io/projected/4ea2d579-b57c-41a6-a255-ee852e50ec7a-kube-api-access-f6gsk\") on node \"crc\" DevicePath \"\"" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.617722 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.617732 4962 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.617743 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ea2d579-b57c-41a6-a255-ee852e50ec7a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.825228 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" event={"ID":"4ea2d579-b57c-41a6-a255-ee852e50ec7a","Type":"ContainerDied","Data":"bfc893d6675e9a73d88dbc1c574de7278891b32765a513b762e3078f941e294c"} Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.825286 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc893d6675e9a73d88dbc1c574de7278891b32765a513b762e3078f941e294c" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.825380 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hwvmq" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.933971 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz"] Dec 01 22:13:03 crc kubenswrapper[4962]: E1201 22:13:03.934412 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="extract-utilities" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.934428 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="extract-utilities" Dec 01 22:13:03 crc kubenswrapper[4962]: E1201 22:13:03.934444 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="extract-content" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.934449 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="extract-content" Dec 01 22:13:03 crc kubenswrapper[4962]: E1201 22:13:03.934490 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="registry-server" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.934497 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="registry-server" Dec 01 22:13:03 crc kubenswrapper[4962]: E1201 22:13:03.934508 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea2d579-b57c-41a6-a255-ee852e50ec7a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.934513 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea2d579-b57c-41a6-a255-ee852e50ec7a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.934723 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8796560f-c576-43fc-8d05-af44154632ab" containerName="registry-server" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.934749 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea2d579-b57c-41a6-a255-ee852e50ec7a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.935735 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.938147 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.938412 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.938422 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.938796 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.938804 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.943323 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:13:03 crc kubenswrapper[4962]: I1201 22:13:03.958412 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz"] Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.130292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.130364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.130490 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.130536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.130562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w24c\" (UniqueName: 
\"kubernetes.io/projected/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-kube-api-access-6w24c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.130592 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.232025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.232092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.232219 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.232278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.232299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w24c\" (UniqueName: \"kubernetes.io/projected/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-kube-api-access-6w24c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.232324 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.235818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.235819 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.236470 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.236680 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.249277 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.251078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w24c\" (UniqueName: \"kubernetes.io/projected/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-kube-api-access-6w24c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.306789 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:13:04 crc kubenswrapper[4962]: I1201 22:13:04.829704 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz"] Dec 01 22:13:05 crc kubenswrapper[4962]: I1201 22:13:05.881410 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" event={"ID":"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c","Type":"ContainerStarted","Data":"597b2dd9fb1b690915767475cc544a35a4802fc676835cbbe384ddffaae9d305"} Dec 01 22:13:05 crc kubenswrapper[4962]: I1201 22:13:05.882160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" event={"ID":"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c","Type":"ContainerStarted","Data":"e81943b081ad715cd867a7238df9d811938cbe212b925a82c3cf8cf516285798"} Dec 01 22:13:05 crc kubenswrapper[4962]: I1201 22:13:05.923090 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" podStartSLOduration=2.365999708 podStartE2EDuration="2.923065808s" podCreationTimestamp="2025-12-01 22:13:03 +0000 UTC" firstStartedPulling="2025-12-01 22:13:04.839564589 +0000 UTC m=+2368.941003784" lastFinishedPulling="2025-12-01 22:13:05.396630649 +0000 UTC m=+2369.498069884" observedRunningTime="2025-12-01 22:13:05.907669591 +0000 UTC m=+2370.009108816" watchObservedRunningTime="2025-12-01 22:13:05.923065808 +0000 UTC m=+2370.024505013" Dec 01 22:13:32 crc kubenswrapper[4962]: I1201 22:13:32.784289 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:13:32 crc kubenswrapper[4962]: I1201 22:13:32.784731 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:13:32 crc kubenswrapper[4962]: I1201 22:13:32.784772 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 22:13:32 crc kubenswrapper[4962]: I1201 22:13:32.785584 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 22:13:32 crc kubenswrapper[4962]: I1201 22:13:32.785636 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" gracePeriod=600 Dec 01 22:13:32 crc kubenswrapper[4962]: E1201 22:13:32.927129 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:13:33 crc kubenswrapper[4962]: I1201 22:13:33.264143 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" exitCode=0 Dec 01 22:13:33 crc kubenswrapper[4962]: I1201 22:13:33.264370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"} Dec 01 22:13:33 crc kubenswrapper[4962]: I1201 22:13:33.264492 4962 scope.go:117] "RemoveContainer" containerID="f9afb91c220faf243aae75f9834e77e5c18873f4e7d46e25474ebf0923963082" Dec 01 22:13:33 crc kubenswrapper[4962]: I1201 22:13:33.266045 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:13:33 crc kubenswrapper[4962]: E1201 22:13:33.267786 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:13:47 crc kubenswrapper[4962]: I1201 22:13:47.220672 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:13:47 crc kubenswrapper[4962]: E1201 22:13:47.221864 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:14:01 crc kubenswrapper[4962]: I1201 22:14:01.221157 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:14:01 crc kubenswrapper[4962]: E1201 22:14:01.222243 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:14:03 crc kubenswrapper[4962]: I1201 22:14:03.717977 4962 generic.go:334] "Generic (PLEG): container finished" podID="dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" containerID="597b2dd9fb1b690915767475cc544a35a4802fc676835cbbe384ddffaae9d305" exitCode=0 Dec 01 22:14:03 crc kubenswrapper[4962]: I1201 22:14:03.718014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" event={"ID":"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c","Type":"ContainerDied","Data":"597b2dd9fb1b690915767475cc544a35a4802fc676835cbbe384ddffaae9d305"} Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.217039 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.274148 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w24c\" (UniqueName: \"kubernetes.io/projected/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-kube-api-access-6w24c\") pod \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.274207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-nova-metadata-neutron-config-0\") pod \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.274316 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-inventory\") pod \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.274446 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.274507 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-metadata-combined-ca-bundle\") pod \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.274557 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-ssh-key\") pod \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\" (UID: \"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c\") " Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.281240 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-kube-api-access-6w24c" (OuterVolumeSpecName: "kube-api-access-6w24c") pod "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" (UID: "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c"). InnerVolumeSpecName "kube-api-access-6w24c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.282996 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" (UID: "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.321939 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" (UID: "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.331102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" (UID: "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.332498 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" (UID: "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.345467 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-inventory" (OuterVolumeSpecName: "inventory") pod "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" (UID: "dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.378398 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w24c\" (UniqueName: \"kubernetes.io/projected/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-kube-api-access-6w24c\") on node \"crc\" DevicePath \"\"" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.378641 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.378728 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.378852 4962 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.378965 4962 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.379059 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.738285 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" event={"ID":"dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c","Type":"ContainerDied","Data":"e81943b081ad715cd867a7238df9d811938cbe212b925a82c3cf8cf516285798"} Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.738560 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e81943b081ad715cd867a7238df9d811938cbe212b925a82c3cf8cf516285798" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.738632 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.957276 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx"] Dec 01 22:14:05 crc kubenswrapper[4962]: E1201 22:14:05.958272 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.958308 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.958800 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.960075 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.961702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.962322 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.962429 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.962737 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.964893 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:14:05 crc kubenswrapper[4962]: I1201 22:14:05.983065 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx"] Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.098090 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.098433 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.098715 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slw6z\" (UniqueName: \"kubernetes.io/projected/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-kube-api-access-slw6z\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.098918 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.099089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.201659 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.202283 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.202734 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slw6z\" (UniqueName: \"kubernetes.io/projected/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-kube-api-access-slw6z\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.202798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.202840 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.209853 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.210580 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.210744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.210895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.224182 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slw6z\" (UniqueName: \"kubernetes.io/projected/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-kube-api-access-slw6z\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.297889 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:14:06 crc kubenswrapper[4962]: I1201 22:14:06.916316 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx"] Dec 01 22:14:06 crc kubenswrapper[4962]: W1201 22:14:06.922534 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb6f9af8_342d_4a5d_bd75_21d8d0f95c04.slice/crio-a20547365035c5d82730f4b718329ff356a91e082e29c134d360d9a027275adc WatchSource:0}: Error finding container a20547365035c5d82730f4b718329ff356a91e082e29c134d360d9a027275adc: Status 404 returned error can't find the container with id a20547365035c5d82730f4b718329ff356a91e082e29c134d360d9a027275adc Dec 01 22:14:07 crc kubenswrapper[4962]: I1201 22:14:07.760425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" event={"ID":"db6f9af8-342d-4a5d-bd75-21d8d0f95c04","Type":"ContainerStarted","Data":"a20547365035c5d82730f4b718329ff356a91e082e29c134d360d9a027275adc"} Dec 01 22:14:08 crc kubenswrapper[4962]: I1201 22:14:08.776208 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" event={"ID":"db6f9af8-342d-4a5d-bd75-21d8d0f95c04","Type":"ContainerStarted","Data":"770078652f9222dbbe830a34a8483903d9c61ab437f4726a40785a7c152413e4"} Dec 01 22:14:08 crc kubenswrapper[4962]: I1201 22:14:08.809056 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" podStartSLOduration=3.186696865 podStartE2EDuration="3.809016702s" podCreationTimestamp="2025-12-01 22:14:05 +0000 UTC" firstStartedPulling="2025-12-01 22:14:06.926216651 +0000 UTC m=+2431.027655856" lastFinishedPulling="2025-12-01 22:14:07.548536498 +0000 UTC m=+2431.649975693" observedRunningTime="2025-12-01 22:14:08.80298193 +0000 UTC m=+2432.904421145" watchObservedRunningTime="2025-12-01 22:14:08.809016702 +0000 UTC m=+2432.910455917" Dec 01 22:14:12 crc kubenswrapper[4962]: I1201 22:14:12.220956 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:14:12 crc kubenswrapper[4962]: E1201 22:14:12.222093 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:14:27 crc 
Dec 01 22:14:27 crc kubenswrapper[4962]: E1201 22:14:27.221437 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:14:41 crc kubenswrapper[4962]: I1201 22:14:41.219832 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:14:41 crc kubenswrapper[4962]: E1201 22:14:41.221213 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:14:54 crc kubenswrapper[4962]: I1201 22:14:54.220635 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:14:54 crc kubenswrapper[4962]: E1201 22:14:54.221980 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.190005 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"]
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.193244 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.197031 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.197640 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.218477 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"]
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.233318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-config-volume\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.233835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-secret-volume\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.233978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4mxh\" (UniqueName: \"kubernetes.io/projected/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-kube-api-access-k4mxh\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.335877 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-config-volume\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.336066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-secret-volume\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.336134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4mxh\" (UniqueName: \"kubernetes.io/projected/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-kube-api-access-k4mxh\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.337655 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-config-volume\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.346389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-secret-volume\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.358367 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4mxh\" (UniqueName: \"kubernetes.io/projected/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-kube-api-access-k4mxh\") pod \"collect-profiles-29410455-m4kkr\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.549334 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:00 crc kubenswrapper[4962]: I1201 22:15:00.885946 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"]
Dec 01 22:15:01 crc kubenswrapper[4962]: I1201 22:15:01.533295 4962 generic.go:334] "Generic (PLEG): container finished" podID="5c084a91-ba28-43e1-b1d7-bb0c15be6c97" containerID="03826c7bbfe40b786382eebb831fd7e55698210c7d2284243d7e483cd28931cb" exitCode=0
Dec 01 22:15:01 crc kubenswrapper[4962]: I1201 22:15:01.533360 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr" event={"ID":"5c084a91-ba28-43e1-b1d7-bb0c15be6c97","Type":"ContainerDied","Data":"03826c7bbfe40b786382eebb831fd7e55698210c7d2284243d7e483cd28931cb"}
Dec 01 22:15:01 crc kubenswrapper[4962]: I1201 22:15:01.533583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr" event={"ID":"5c084a91-ba28-43e1-b1d7-bb0c15be6c97","Type":"ContainerStarted","Data":"c0f2784eed29cb3d0ba2547c0b036b48dcf4386fa392f8e05cecc6fdb48c12a4"}
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.006407 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.103646 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-secret-volume\") pod \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") "
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.105466 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-config-volume\") pod \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") "
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.105512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4mxh\" (UniqueName: \"kubernetes.io/projected/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-kube-api-access-k4mxh\") pod \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\" (UID: \"5c084a91-ba28-43e1-b1d7-bb0c15be6c97\") "
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.106185 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c084a91-ba28-43e1-b1d7-bb0c15be6c97" (UID: "5c084a91-ba28-43e1-b1d7-bb0c15be6c97"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.106498 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.112902 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c084a91-ba28-43e1-b1d7-bb0c15be6c97" (UID: "5c084a91-ba28-43e1-b1d7-bb0c15be6c97"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.113153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-kube-api-access-k4mxh" (OuterVolumeSpecName: "kube-api-access-k4mxh") pod "5c084a91-ba28-43e1-b1d7-bb0c15be6c97" (UID: "5c084a91-ba28-43e1-b1d7-bb0c15be6c97"). InnerVolumeSpecName "kube-api-access-k4mxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.209127 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.209159 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4mxh\" (UniqueName: \"kubernetes.io/projected/5c084a91-ba28-43e1-b1d7-bb0c15be6c97-kube-api-access-k4mxh\") on node \"crc\" DevicePath \"\""
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.563872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr" event={"ID":"5c084a91-ba28-43e1-b1d7-bb0c15be6c97","Type":"ContainerDied","Data":"c0f2784eed29cb3d0ba2547c0b036b48dcf4386fa392f8e05cecc6fdb48c12a4"}
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.564288 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f2784eed29cb3d0ba2547c0b036b48dcf4386fa392f8e05cecc6fdb48c12a4"
Dec 01 22:15:03 crc kubenswrapper[4962]: I1201 22:15:03.564037 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"
Dec 01 22:15:04 crc kubenswrapper[4962]: I1201 22:15:04.102859 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr"]
Dec 01 22:15:04 crc kubenswrapper[4962]: I1201 22:15:04.113603 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410410-kr2lr"]
Dec 01 22:15:04 crc kubenswrapper[4962]: I1201 22:15:04.239736 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fde3e2c2-ed59-4cbf-8554-1a0438eb81dc" path="/var/lib/kubelet/pods/fde3e2c2-ed59-4cbf-8554-1a0438eb81dc/volumes"
Dec 01 22:15:06 crc kubenswrapper[4962]: I1201 22:15:06.235449 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:15:06 crc kubenswrapper[4962]: E1201 22:15:06.236275 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:15:06 crc kubenswrapper[4962]: I1201 22:15:06.654834 4962 scope.go:117] "RemoveContainer" containerID="578cf2c176fdcfafd49e6d657f8c064a6770b8a85590ca82bc4ea2b72aa4403d"
Dec 01 22:15:19 crc kubenswrapper[4962]: I1201 22:15:19.219868 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:15:19 crc kubenswrapper[4962]: E1201 22:15:19.220969 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:15:34 crc kubenswrapper[4962]: I1201 22:15:34.221609 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:15:34 crc kubenswrapper[4962]: E1201 22:15:34.222918 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:15:46 crc kubenswrapper[4962]: I1201 22:15:46.226872 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:15:46 crc kubenswrapper[4962]: E1201 22:15:46.227963 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:15:58 crc kubenswrapper[4962]: I1201 22:15:58.220287 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:15:58 crc kubenswrapper[4962]: E1201 22:15:58.221491 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:16:10 crc kubenswrapper[4962]: I1201 22:16:10.220166 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:16:10 crc kubenswrapper[4962]: E1201 22:16:10.220968 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:16:23 crc kubenswrapper[4962]: I1201 22:16:23.220114 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:16:23 crc kubenswrapper[4962]: E1201 22:16:23.221359 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.674905 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qgq22"]
Dec 01 22:16:29 crc kubenswrapper[4962]: E1201 22:16:29.676683 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c084a91-ba28-43e1-b1d7-bb0c15be6c97" containerName="collect-profiles"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.676710 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c084a91-ba28-43e1-b1d7-bb0c15be6c97" containerName="collect-profiles"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.677296 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c084a91-ba28-43e1-b1d7-bb0c15be6c97" containerName="collect-profiles"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.681031 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.696170 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qgq22"]
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.854309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-utilities\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.854358 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-catalog-content\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.854504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv7p7\" (UniqueName: \"kubernetes.io/projected/e5289a6d-3c62-4e74-934a-af45a16d5dcb-kube-api-access-jv7p7\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.956594 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-utilities\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.956652 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-catalog-content\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.956796 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv7p7\" (UniqueName: \"kubernetes.io/projected/e5289a6d-3c62-4e74-934a-af45a16d5dcb-kube-api-access-jv7p7\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.957285 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-utilities\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.957394 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-catalog-content\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:29 crc kubenswrapper[4962]: I1201 22:16:29.978021 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv7p7\" (UniqueName: \"kubernetes.io/projected/e5289a6d-3c62-4e74-934a-af45a16d5dcb-kube-api-access-jv7p7\") pod \"redhat-operators-qgq22\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:30 crc kubenswrapper[4962]: I1201 22:16:30.026567 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:30 crc kubenswrapper[4962]: I1201 22:16:30.535074 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qgq22"]
Dec 01 22:16:30 crc kubenswrapper[4962]: I1201 22:16:30.689416 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgq22" event={"ID":"e5289a6d-3c62-4e74-934a-af45a16d5dcb","Type":"ContainerStarted","Data":"623d11fa1288e9684670dbdf531372e9af25c6e2c6749de3329e5ea50c3766ce"}
Dec 01 22:16:31 crc kubenswrapper[4962]: I1201 22:16:31.702476 4962 generic.go:334] "Generic (PLEG): container finished" podID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerID="3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9" exitCode=0
Dec 01 22:16:31 crc kubenswrapper[4962]: I1201 22:16:31.702688 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgq22" event={"ID":"e5289a6d-3c62-4e74-934a-af45a16d5dcb","Type":"ContainerDied","Data":"3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9"}
Dec 01 22:16:31 crc kubenswrapper[4962]: I1201 22:16:31.706036 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 22:16:33 crc kubenswrapper[4962]: I1201 22:16:33.734017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgq22" event={"ID":"e5289a6d-3c62-4e74-934a-af45a16d5dcb","Type":"ContainerStarted","Data":"5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743"}
Dec 01 22:16:37 crc kubenswrapper[4962]: I1201 22:16:37.220528 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:16:37 crc kubenswrapper[4962]: E1201 22:16:37.221537 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:16:37 crc kubenswrapper[4962]: I1201 22:16:37.823143 4962 generic.go:334] "Generic (PLEG): container finished" podID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerID="5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743" exitCode=0
Dec 01 22:16:37 crc kubenswrapper[4962]: I1201 22:16:37.823230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgq22" event={"ID":"e5289a6d-3c62-4e74-934a-af45a16d5dcb","Type":"ContainerDied","Data":"5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743"}
Dec 01 22:16:38 crc kubenswrapper[4962]: I1201 22:16:38.835751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgq22" event={"ID":"e5289a6d-3c62-4e74-934a-af45a16d5dcb","Type":"ContainerStarted","Data":"d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243"}
Dec 01 22:16:38 crc kubenswrapper[4962]: I1201 22:16:38.865335 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qgq22" podStartSLOduration=3.195305347 podStartE2EDuration="9.865315386s" podCreationTimestamp="2025-12-01 22:16:29 +0000 UTC" firstStartedPulling="2025-12-01 22:16:31.705784247 +0000 UTC m=+2575.807223442" lastFinishedPulling="2025-12-01 22:16:38.375794286 +0000 UTC m=+2582.477233481" observedRunningTime="2025-12-01 22:16:38.855343613 +0000 UTC m=+2582.956782808" watchObservedRunningTime="2025-12-01 22:16:38.865315386 +0000 UTC m=+2582.966754581"
Dec 01 22:16:40 crc kubenswrapper[4962]: I1201 22:16:40.026812 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:40 crc kubenswrapper[4962]: I1201 22:16:40.026876 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:41 crc kubenswrapper[4962]: I1201 22:16:41.108688 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qgq22" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="registry-server" probeResult="failure" output=<
Dec 01 22:16:41 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s
Dec 01 22:16:41 crc kubenswrapper[4962]: >
Dec 01 22:16:50 crc kubenswrapper[4962]: I1201 22:16:50.133492 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:50 crc kubenswrapper[4962]: I1201 22:16:50.195716 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qgq22"
Dec 01 22:16:50 crc kubenswrapper[4962]: I1201 22:16:50.219912 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c"
Dec 01 22:16:50 crc kubenswrapper[4962]: E1201 22:16:50.220832 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:16:50 crc kubenswrapper[4962]: I1201 22:16:50.383751 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qgq22"]
Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.041214 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qgq22" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="registry-server" containerID="cri-o://d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243" gracePeriod=2
pod="openshift-marketplace/redhat-operators-qgq22" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="registry-server" containerID="cri-o://d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243" gracePeriod=2 Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.713973 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgq22" Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.810172 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-catalog-content\") pod \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.810279 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv7p7\" (UniqueName: \"kubernetes.io/projected/e5289a6d-3c62-4e74-934a-af45a16d5dcb-kube-api-access-jv7p7\") pod \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.810362 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-utilities\") pod \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\" (UID: \"e5289a6d-3c62-4e74-934a-af45a16d5dcb\") " Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.810910 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-utilities" (OuterVolumeSpecName: "utilities") pod "e5289a6d-3c62-4e74-934a-af45a16d5dcb" (UID: "e5289a6d-3c62-4e74-934a-af45a16d5dcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.811018 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.816454 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5289a6d-3c62-4e74-934a-af45a16d5dcb-kube-api-access-jv7p7" (OuterVolumeSpecName: "kube-api-access-jv7p7") pod "e5289a6d-3c62-4e74-934a-af45a16d5dcb" (UID: "e5289a6d-3c62-4e74-934a-af45a16d5dcb"). InnerVolumeSpecName "kube-api-access-jv7p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.914689 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv7p7\" (UniqueName: \"kubernetes.io/projected/e5289a6d-3c62-4e74-934a-af45a16d5dcb-kube-api-access-jv7p7\") on node \"crc\" DevicePath \"\"" Dec 01 22:16:52 crc kubenswrapper[4962]: I1201 22:16:52.939276 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5289a6d-3c62-4e74-934a-af45a16d5dcb" (UID: "e5289a6d-3c62-4e74-934a-af45a16d5dcb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.016762 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5289a6d-3c62-4e74-934a-af45a16d5dcb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.067768 4962 generic.go:334] "Generic (PLEG): container finished" podID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerID="d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243" exitCode=0 Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.067830 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgq22" event={"ID":"e5289a6d-3c62-4e74-934a-af45a16d5dcb","Type":"ContainerDied","Data":"d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243"} Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.067868 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qgq22" event={"ID":"e5289a6d-3c62-4e74-934a-af45a16d5dcb","Type":"ContainerDied","Data":"623d11fa1288e9684670dbdf531372e9af25c6e2c6749de3329e5ea50c3766ce"} Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.067895 4962 scope.go:117] "RemoveContainer" containerID="d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.068123 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qgq22" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.108803 4962 scope.go:117] "RemoveContainer" containerID="5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.138598 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qgq22"] Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.153045 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qgq22"] Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.160104 4962 scope.go:117] "RemoveContainer" containerID="3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.209982 4962 scope.go:117] "RemoveContainer" containerID="d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243" Dec 01 22:16:53 crc kubenswrapper[4962]: E1201 22:16:53.211381 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243\": container with ID starting with d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243 not found: ID does not exist" containerID="d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.211432 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243"} err="failed to get container status \"d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243\": rpc error: code = NotFound desc = could not find container \"d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243\": container with ID starting with d7885764d05adb9ac8bf222b3848ff0d96fe31b72a02d4afb82a253706183243 not found: ID does not exist" Dec 01 22:16:53 crc 
kubenswrapper[4962]: I1201 22:16:53.211461 4962 scope.go:117] "RemoveContainer" containerID="5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743" Dec 01 22:16:53 crc kubenswrapper[4962]: E1201 22:16:53.212184 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743\": container with ID starting with 5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743 not found: ID does not exist" containerID="5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.212245 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743"} err="failed to get container status \"5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743\": rpc error: code = NotFound desc = could not find container \"5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743\": container with ID starting with 5074d718c43e8c380a577375338fea80caad8b2610532780ba6e482a58871743 not found: ID does not exist" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.212287 4962 scope.go:117] "RemoveContainer" containerID="3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9" Dec 01 22:16:53 crc kubenswrapper[4962]: E1201 22:16:53.213223 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9\": container with ID starting with 3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9 not found: ID does not exist" containerID="3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9" Dec 01 22:16:53 crc kubenswrapper[4962]: I1201 22:16:53.213252 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9"} err="failed to get container status \"3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9\": rpc error: code = NotFound desc = could not find container \"3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9\": container with ID starting with 3afc365a40de35b305a7901f2ba5eb3d9f047413d924d2eb4b0f8ea4de310ec9 not found: ID does not exist" Dec 01 22:16:54 crc kubenswrapper[4962]: I1201 22:16:54.243881 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" path="/var/lib/kubelet/pods/e5289a6d-3c62-4e74-934a-af45a16d5dcb/volumes" Dec 01 22:17:05 crc kubenswrapper[4962]: I1201 22:17:05.220829 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:17:05 crc kubenswrapper[4962]: E1201 22:17:05.221901 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:17:20 crc kubenswrapper[4962]: I1201 22:17:20.220264 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" 
Dec 01 22:17:20 crc kubenswrapper[4962]: E1201 22:17:20.221413 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.692115 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9s4rw"] Dec 01 22:17:21 crc kubenswrapper[4962]: E1201 22:17:21.693197 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="extract-utilities" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.693227 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="extract-utilities" Dec 01 22:17:21 crc kubenswrapper[4962]: E1201 22:17:21.693271 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="registry-server" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.693284 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="registry-server" Dec 01 22:17:21 crc kubenswrapper[4962]: E1201 22:17:21.693341 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="extract-content" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.693354 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="extract-content" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.693802 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5289a6d-3c62-4e74-934a-af45a16d5dcb" containerName="registry-server" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.698183 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.720722 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9s4rw"] Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.820105 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w786b\" (UniqueName: \"kubernetes.io/projected/49a0393e-ecf0-499c-aeb6-ed4f6175595c-kube-api-access-w786b\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.820210 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-utilities\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.820255 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-catalog-content\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.923269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w786b\" (UniqueName: \"kubernetes.io/projected/49a0393e-ecf0-499c-aeb6-ed4f6175595c-kube-api-access-w786b\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.923334 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-utilities\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.923363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-catalog-content\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.924063 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-catalog-content\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.924206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-utilities\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:21 crc kubenswrapper[4962]: I1201 22:17:21.951556 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w786b\" (UniqueName: \"kubernetes.io/projected/49a0393e-ecf0-499c-aeb6-ed4f6175595c-kube-api-access-w786b\") pod \"community-operators-9s4rw\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:22 crc kubenswrapper[4962]: I1201 22:17:22.032064 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:22 crc kubenswrapper[4962]: I1201 22:17:22.557409 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9s4rw"] Dec 01 22:17:23 crc kubenswrapper[4962]: I1201 22:17:23.516594 4962 generic.go:334] "Generic (PLEG): container finished" podID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerID="88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059" exitCode=0 Dec 01 22:17:23 crc kubenswrapper[4962]: I1201 22:17:23.516640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s4rw" event={"ID":"49a0393e-ecf0-499c-aeb6-ed4f6175595c","Type":"ContainerDied","Data":"88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059"} Dec 01 22:17:23 crc kubenswrapper[4962]: I1201 22:17:23.516670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s4rw" event={"ID":"49a0393e-ecf0-499c-aeb6-ed4f6175595c","Type":"ContainerStarted","Data":"f2c87aee4472ad514e980670479cd314afcc67d4ff6bd3386d069a04ac4e0c62"} Dec 01 22:17:24 crc kubenswrapper[4962]: I1201 22:17:24.535862 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s4rw" event={"ID":"49a0393e-ecf0-499c-aeb6-ed4f6175595c","Type":"ContainerStarted","Data":"ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac"} Dec 01 22:17:25 crc kubenswrapper[4962]: I1201 22:17:25.558639 4962 generic.go:334] "Generic (PLEG): container finished" podID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerID="ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac" exitCode=0 Dec 01 22:17:25 crc kubenswrapper[4962]: I1201 22:17:25.558709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s4rw" event={"ID":"49a0393e-ecf0-499c-aeb6-ed4f6175595c","Type":"ContainerDied","Data":"ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac"} Dec 01 22:17:26 crc kubenswrapper[4962]: I1201 22:17:26.573599 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s4rw" event={"ID":"49a0393e-ecf0-499c-aeb6-ed4f6175595c","Type":"ContainerStarted","Data":"53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a"} Dec 01 22:17:26 crc kubenswrapper[4962]: I1201 22:17:26.617758 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9s4rw" podStartSLOduration=3.032081111 podStartE2EDuration="5.617736614s" podCreationTimestamp="2025-12-01 22:17:21 +0000 UTC" firstStartedPulling="2025-12-01 22:17:23.523998345 +0000 UTC m=+2627.625437570" lastFinishedPulling="2025-12-01 22:17:26.109653848 +0000 UTC m=+2630.211093073" observedRunningTime="2025-12-01 22:17:26.604082987 +0000 UTC m=+2630.705522262" watchObservedRunningTime="2025-12-01 22:17:26.617736614 +0000 UTC m=+2630.719175819" Dec 01 22:17:32 crc kubenswrapper[4962]: I1201 22:17:32.032679 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:32 crc kubenswrapper[4962]: I1201 22:17:32.033978 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:32 crc kubenswrapper[4962]: I1201 22:17:32.122764 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:32 crc kubenswrapper[4962]: I1201 22:17:32.729397 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:32 crc kubenswrapper[4962]: I1201 22:17:32.807645 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9s4rw"] Dec 01 22:17:34 crc kubenswrapper[4962]: I1201 22:17:34.220602 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:17:34 crc kubenswrapper[4962]: E1201 22:17:34.221343 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:17:34 crc kubenswrapper[4962]: I1201 22:17:34.682354 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9s4rw" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="registry-server" containerID="cri-o://53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a" gracePeriod=2 Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.326820 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.449406 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-utilities\") pod \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.449545 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-catalog-content\") pod \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.449922 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w786b\" (UniqueName: \"kubernetes.io/projected/49a0393e-ecf0-499c-aeb6-ed4f6175595c-kube-api-access-w786b\") pod \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\" (UID: \"49a0393e-ecf0-499c-aeb6-ed4f6175595c\") " Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.450226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-utilities" (OuterVolumeSpecName: "utilities") pod "49a0393e-ecf0-499c-aeb6-ed4f6175595c" (UID: "49a0393e-ecf0-499c-aeb6-ed4f6175595c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.451070 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.459187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a0393e-ecf0-499c-aeb6-ed4f6175595c-kube-api-access-w786b" (OuterVolumeSpecName: "kube-api-access-w786b") pod "49a0393e-ecf0-499c-aeb6-ed4f6175595c" (UID: "49a0393e-ecf0-499c-aeb6-ed4f6175595c"). InnerVolumeSpecName "kube-api-access-w786b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.516688 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49a0393e-ecf0-499c-aeb6-ed4f6175595c" (UID: "49a0393e-ecf0-499c-aeb6-ed4f6175595c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.553491 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w786b\" (UniqueName: \"kubernetes.io/projected/49a0393e-ecf0-499c-aeb6-ed4f6175595c-kube-api-access-w786b\") on node \"crc\" DevicePath \"\"" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.553528 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a0393e-ecf0-499c-aeb6-ed4f6175595c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.700404 4962 generic.go:334] "Generic (PLEG): container finished" podID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerID="53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a" exitCode=0 Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.700480 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9s4rw" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.700496 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s4rw" event={"ID":"49a0393e-ecf0-499c-aeb6-ed4f6175595c","Type":"ContainerDied","Data":"53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a"} Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.700996 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9s4rw" event={"ID":"49a0393e-ecf0-499c-aeb6-ed4f6175595c","Type":"ContainerDied","Data":"f2c87aee4472ad514e980670479cd314afcc67d4ff6bd3386d069a04ac4e0c62"} Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.701025 4962 scope.go:117] "RemoveContainer" containerID="53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.734097 4962 scope.go:117] "RemoveContainer" containerID="ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.785503 4962 scope.go:117] "RemoveContainer" containerID="88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.815328 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9s4rw"] Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.821699 4962 scope.go:117] "RemoveContainer" containerID="53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a" Dec 01 22:17:35 crc kubenswrapper[4962]: E1201 22:17:35.822110 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a\": container with ID starting with 53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a not found: ID does not exist" containerID="53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.822142 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a"} err="failed to get container status \"53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a\": rpc error: code = NotFound desc = could not find container \"53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a\": container with ID starting with 53617a96f3444895269f73490363fad7ba39f4cd493a82f4defa73a2b17aec3a not found: ID does not exist" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.822164 4962 scope.go:117] "RemoveContainer" containerID="ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac" Dec 01 22:17:35 crc kubenswrapper[4962]: E1201 22:17:35.822455 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac\": container with ID starting with ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac not found: ID does not exist" containerID="ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.822477 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac"} err="failed 
to get container status \"ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac\": rpc error: code = NotFound desc = could not find container \"ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac\": container with ID starting with ebf44098c00eba900267f59dbc1c4b66584c33c6a81a7796fd0d7165fb498cac not found: ID does not exist" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.822490 4962 scope.go:117] "RemoveContainer" containerID="88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059" Dec 01 22:17:35 crc kubenswrapper[4962]: E1201 22:17:35.822701 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059\": container with ID starting with 88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059 not found: ID does not exist" containerID="88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.822724 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059"} err="failed to get container status \"88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059\": rpc error: code = NotFound desc = could not find container \"88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059\": container with ID starting with 88f39f0595374a46d1f116408c61b9a64196ee80d22ac6d0e3e7874090cb3059 not found: ID does not exist" Dec 01 22:17:35 crc kubenswrapper[4962]: I1201 22:17:35.828370 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9s4rw"] Dec 01 22:17:36 crc kubenswrapper[4962]: I1201 22:17:36.238738 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" path="/var/lib/kubelet/pods/49a0393e-ecf0-499c-aeb6-ed4f6175595c/volumes" Dec 01 22:17:47 crc kubenswrapper[4962]: I1201 22:17:47.220053 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:17:47 crc kubenswrapper[4962]: E1201 22:17:47.221427 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:18:01 crc kubenswrapper[4962]: I1201 22:18:01.220961 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:18:01 crc kubenswrapper[4962]: E1201 22:18:01.221706 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:18:12 crc kubenswrapper[4962]: I1201 22:18:12.220258 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:18:12 crc 
kubenswrapper[4962]: E1201 22:18:12.221117 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:18:24 crc kubenswrapper[4962]: I1201 22:18:24.219574 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:18:24 crc kubenswrapper[4962]: E1201 22:18:24.220574 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:18:35 crc kubenswrapper[4962]: I1201 22:18:35.219594 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:18:35 crc kubenswrapper[4962]: I1201 22:18:35.543342 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"7e86de3a727bf2cd2d1675ee45c0547dea508d02b19d9166f854056d0475f742"} Dec 01 22:18:58 crc kubenswrapper[4962]: I1201 22:18:58.887838 4962 generic.go:334] "Generic (PLEG): container finished" podID="db6f9af8-342d-4a5d-bd75-21d8d0f95c04" containerID="770078652f9222dbbe830a34a8483903d9c61ab437f4726a40785a7c152413e4" exitCode=0 Dec 01 22:18:58 crc kubenswrapper[4962]: I1201 22:18:58.887998 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" event={"ID":"db6f9af8-342d-4a5d-bd75-21d8d0f95c04","Type":"ContainerDied","Data":"770078652f9222dbbe830a34a8483903d9c61ab437f4726a40785a7c152413e4"} Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.474925 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.601998 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-secret-0\") pod \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.602177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-ssh-key\") pod \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.602329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slw6z\" (UniqueName: \"kubernetes.io/projected/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-kube-api-access-slw6z\") pod \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.602351 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-combined-ca-bundle\") pod \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.602383 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-inventory\") pod \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\" (UID: \"db6f9af8-342d-4a5d-bd75-21d8d0f95c04\") " Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.613185 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-kube-api-access-slw6z" (OuterVolumeSpecName: "kube-api-access-slw6z") pod "db6f9af8-342d-4a5d-bd75-21d8d0f95c04" (UID: "db6f9af8-342d-4a5d-bd75-21d8d0f95c04"). InnerVolumeSpecName "kube-api-access-slw6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.639140 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "db6f9af8-342d-4a5d-bd75-21d8d0f95c04" (UID: "db6f9af8-342d-4a5d-bd75-21d8d0f95c04"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.697289 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "db6f9af8-342d-4a5d-bd75-21d8d0f95c04" (UID: "db6f9af8-342d-4a5d-bd75-21d8d0f95c04"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.707140 4962 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.707764 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slw6z\" (UniqueName: \"kubernetes.io/projected/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-kube-api-access-slw6z\") on node \"crc\" DevicePath \"\"" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.707796 4962 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.711346 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db6f9af8-342d-4a5d-bd75-21d8d0f95c04" (UID: "db6f9af8-342d-4a5d-bd75-21d8d0f95c04"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.729979 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-inventory" (OuterVolumeSpecName: "inventory") pod "db6f9af8-342d-4a5d-bd75-21d8d0f95c04" (UID: "db6f9af8-342d-4a5d-bd75-21d8d0f95c04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.810677 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.810908 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db6f9af8-342d-4a5d-bd75-21d8d0f95c04-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.918401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" event={"ID":"db6f9af8-342d-4a5d-bd75-21d8d0f95c04","Type":"ContainerDied","Data":"a20547365035c5d82730f4b718329ff356a91e082e29c134d360d9a027275adc"} Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.918439 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20547365035c5d82730f4b718329ff356a91e082e29c134d360d9a027275adc" Dec 01 22:19:00 crc kubenswrapper[4962]: I1201 22:19:00.918471 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.057904 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278"] Dec 01 22:19:01 crc kubenswrapper[4962]: E1201 22:19:01.058400 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="registry-server" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.058418 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="registry-server" Dec 01 22:19:01 crc kubenswrapper[4962]: E1201 22:19:01.058432 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="extract-utilities" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.058440 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="extract-utilities" Dec 01 22:19:01 crc kubenswrapper[4962]: E1201 22:19:01.058460 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6f9af8-342d-4a5d-bd75-21d8d0f95c04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.058469 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6f9af8-342d-4a5d-bd75-21d8d0f95c04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 22:19:01 crc kubenswrapper[4962]: E1201 22:19:01.058497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="extract-content" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.058503 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="extract-content" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.058724 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6f9af8-342d-4a5d-bd75-21d8d0f95c04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.058743 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a0393e-ecf0-499c-aeb6-ed4f6175595c" containerName="registry-server" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.059610 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.062856 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.063100 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.063335 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.063805 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.064617 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.068030 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.079582 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.086393 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278"] Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.220103 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.220250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.220342 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.221312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.221478 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.221714 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km494\" (UniqueName: \"kubernetes.io/projected/db0505bf-0445-4d21-9bc6-a483fdf94816-kube-api-access-km494\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.221896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.222044 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.222201 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.323888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km494\" (UniqueName: \"kubernetes.io/projected/db0505bf-0445-4d21-9bc6-a483fdf94816-kube-api-access-km494\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.323968 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.324002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.324040 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.324114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.324156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.324184 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.324231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.324267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.326195 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.330587 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.331210 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.331700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.332439 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.334167 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.334608 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.334701 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.362021 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km494\" (UniqueName: \"kubernetes.io/projected/db0505bf-0445-4d21-9bc6-a483fdf94816-kube-api-access-km494\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pl278\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:01 crc kubenswrapper[4962]: I1201 22:19:01.384828 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:19:02 crc kubenswrapper[4962]: I1201 22:19:02.049124 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278"] Dec 01 22:19:02 crc kubenswrapper[4962]: I1201 22:19:02.943687 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" event={"ID":"db0505bf-0445-4d21-9bc6-a483fdf94816","Type":"ContainerStarted","Data":"693fe8fed74b94e01c4d0ca2018e9ad76cdbafea5615d7ef17286dbc8c918bf2"} Dec 01 22:19:03 crc kubenswrapper[4962]: I1201 22:19:03.967387 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" event={"ID":"db0505bf-0445-4d21-9bc6-a483fdf94816","Type":"ContainerStarted","Data":"35e25e375748844c604c84f58f348629bceb0f16162dd0afadc585ba149922e5"} Dec 01 22:19:04 crc kubenswrapper[4962]: I1201 22:19:04.024408 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" podStartSLOduration=2.293550937 podStartE2EDuration="3.024384084s" podCreationTimestamp="2025-12-01 22:19:01 +0000 UTC" firstStartedPulling="2025-12-01 22:19:02.054308575 +0000 UTC m=+2726.155747780" lastFinishedPulling="2025-12-01 22:19:02.785141732 +0000 UTC m=+2726.886580927" observedRunningTime="2025-12-01 22:19:03.999008735 +0000 UTC m=+2728.100448020" watchObservedRunningTime="2025-12-01 22:19:04.024384084 +0000 UTC m=+2728.125823319" Dec 01 22:21:02 crc kubenswrapper[4962]: I1201 22:21:02.784186 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:21:02 crc kubenswrapper[4962]: I1201 22:21:02.785245 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.589462 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9cbx5"] Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.595913 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.600573 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cbx5"] Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.728735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-utilities\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.729058 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdfs\" (UniqueName: \"kubernetes.io/projected/3fd18ee7-d466-464a-8ccf-0805e7b82337-kube-api-access-hqdfs\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.729172 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-catalog-content\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.831590 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdfs\" (UniqueName: \"kubernetes.io/projected/3fd18ee7-d466-464a-8ccf-0805e7b82337-kube-api-access-hqdfs\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.831667 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-catalog-content\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.831753 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-utilities\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.832339 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-utilities\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.832532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-catalog-content\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.857274 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hqdfs\" (UniqueName: \"kubernetes.io/projected/3fd18ee7-d466-464a-8ccf-0805e7b82337-kube-api-access-hqdfs\") pod \"certified-operators-9cbx5\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:28 crc kubenswrapper[4962]: I1201 22:21:28.952660 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:29 crc kubenswrapper[4962]: I1201 22:21:29.526158 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cbx5"] Dec 01 22:21:29 crc kubenswrapper[4962]: I1201 22:21:29.891189 4962 generic.go:334] "Generic (PLEG): container finished" podID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerID="9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853" exitCode=0 Dec 01 22:21:29 crc kubenswrapper[4962]: I1201 22:21:29.891231 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cbx5" event={"ID":"3fd18ee7-d466-464a-8ccf-0805e7b82337","Type":"ContainerDied","Data":"9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853"} Dec 01 22:21:29 crc kubenswrapper[4962]: I1201 22:21:29.891259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cbx5" event={"ID":"3fd18ee7-d466-464a-8ccf-0805e7b82337","Type":"ContainerStarted","Data":"3b495d34ee312a23d75a95f096f3ebb8cf06f5c41a09be1c6cd8a3318acd9b91"} Dec 01 22:21:31 crc kubenswrapper[4962]: I1201 22:21:31.922081 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cbx5" event={"ID":"3fd18ee7-d466-464a-8ccf-0805e7b82337","Type":"ContainerStarted","Data":"1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf"} Dec 01 22:21:32 crc kubenswrapper[4962]: I1201 22:21:32.784661 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:21:32 crc kubenswrapper[4962]: I1201 22:21:32.785233 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:21:32 crc kubenswrapper[4962]: I1201 22:21:32.936262 4962 generic.go:334] "Generic (PLEG): container finished" podID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerID="1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf" exitCode=0 Dec 01 22:21:32 crc kubenswrapper[4962]: I1201 22:21:32.936311 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cbx5" event={"ID":"3fd18ee7-d466-464a-8ccf-0805e7b82337","Type":"ContainerDied","Data":"1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf"} Dec 01 22:21:32 crc kubenswrapper[4962]: I1201 22:21:32.940193 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 22:21:33 crc kubenswrapper[4962]: I1201 22:21:33.950727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9cbx5" event={"ID":"3fd18ee7-d466-464a-8ccf-0805e7b82337","Type":"ContainerStarted","Data":"4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb"} Dec 01 22:21:33 crc kubenswrapper[4962]: I1201 22:21:33.984721 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9cbx5" podStartSLOduration=2.393390047 podStartE2EDuration="5.984689902s" podCreationTimestamp="2025-12-01 22:21:28 +0000 UTC" firstStartedPulling="2025-12-01 22:21:29.895366212 +0000 UTC m=+2873.996805417" lastFinishedPulling="2025-12-01 22:21:33.486666077 +0000 UTC m=+2877.588105272" observedRunningTime="2025-12-01 22:21:33.975893294 +0000 UTC m=+2878.077332489" watchObservedRunningTime="2025-12-01 22:21:33.984689902 +0000 UTC m=+2878.086129137" Dec 01 22:21:38 crc kubenswrapper[4962]: I1201 22:21:38.953123 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:38 crc kubenswrapper[4962]: I1201 22:21:38.953676 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:39 crc kubenswrapper[4962]: I1201 22:21:39.015878 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:39 crc kubenswrapper[4962]: I1201 22:21:39.100868 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:39 crc kubenswrapper[4962]: I1201 22:21:39.300082 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cbx5"] Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.057551 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9cbx5" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="registry-server" containerID="cri-o://4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb" gracePeriod=2 Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.648549 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.652925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-catalog-content\") pod \"3fd18ee7-d466-464a-8ccf-0805e7b82337\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.653098 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqdfs\" (UniqueName: \"kubernetes.io/projected/3fd18ee7-d466-464a-8ccf-0805e7b82337-kube-api-access-hqdfs\") pod \"3fd18ee7-d466-464a-8ccf-0805e7b82337\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.661134 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd18ee7-d466-464a-8ccf-0805e7b82337-kube-api-access-hqdfs" (OuterVolumeSpecName: "kube-api-access-hqdfs") pod "3fd18ee7-d466-464a-8ccf-0805e7b82337" (UID: "3fd18ee7-d466-464a-8ccf-0805e7b82337"). InnerVolumeSpecName "kube-api-access-hqdfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.718873 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fd18ee7-d466-464a-8ccf-0805e7b82337" (UID: "3fd18ee7-d466-464a-8ccf-0805e7b82337"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.757032 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-utilities\") pod \"3fd18ee7-d466-464a-8ccf-0805e7b82337\" (UID: \"3fd18ee7-d466-464a-8ccf-0805e7b82337\") " Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.758039 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqdfs\" (UniqueName: \"kubernetes.io/projected/3fd18ee7-d466-464a-8ccf-0805e7b82337-kube-api-access-hqdfs\") on node \"crc\" DevicePath \"\"" Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.758059 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.758769 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-utilities" (OuterVolumeSpecName: "utilities") pod "3fd18ee7-d466-464a-8ccf-0805e7b82337" (UID: "3fd18ee7-d466-464a-8ccf-0805e7b82337"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:21:41 crc kubenswrapper[4962]: I1201 22:21:41.860338 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd18ee7-d466-464a-8ccf-0805e7b82337-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.073040 4962 generic.go:334] "Generic (PLEG): container finished" podID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerID="4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb" exitCode=0 Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.073095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cbx5" event={"ID":"3fd18ee7-d466-464a-8ccf-0805e7b82337","Type":"ContainerDied","Data":"4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb"} Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.073182 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cbx5" event={"ID":"3fd18ee7-d466-464a-8ccf-0805e7b82337","Type":"ContainerDied","Data":"3b495d34ee312a23d75a95f096f3ebb8cf06f5c41a09be1c6cd8a3318acd9b91"} Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.073166 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cbx5" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.073217 4962 scope.go:117] "RemoveContainer" containerID="4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.111333 4962 scope.go:117] "RemoveContainer" containerID="1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.130361 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cbx5"] Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.140989 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9cbx5"] Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.148598 4962 scope.go:117] "RemoveContainer" containerID="9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.202606 4962 scope.go:117] "RemoveContainer" containerID="4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb" Dec 01 22:21:42 crc kubenswrapper[4962]: E1201 22:21:42.203314 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb\": container with ID starting with 4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb not found: ID does not exist" containerID="4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.203348 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb"} err="failed to get container status \"4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb\": rpc error: code = NotFound desc = could not find container \"4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb\": container with ID starting with 4cbdf820cf562b1c69d714e114a7f2ba586a83885391d3b8312cd67bd4ce0efb not found: ID does not exist" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.203370 4962 scope.go:117] "RemoveContainer" containerID="1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf" Dec 01 22:21:42 crc kubenswrapper[4962]: E1201 22:21:42.203772 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf\": container with ID starting with 1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf not found: ID does not exist" containerID="1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.203856 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf"} err="failed to get container status \"1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf\": rpc error: code = NotFound desc = could not find container \"1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf\": container with ID starting with 1e5bfd0fcdd70952edb8239669cc1dcf41d853445e948e2f1408ff42a96cbedf not found: ID does not exist" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.203899 4962 scope.go:117] "RemoveContainer" 
containerID="9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853" Dec 01 22:21:42 crc kubenswrapper[4962]: E1201 22:21:42.207308 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853\": container with ID starting with 9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853 not found: ID does not exist" containerID="9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.207336 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853"} err="failed to get container status \"9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853\": rpc error: code = NotFound desc = could not find container \"9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853\": container with ID starting with 9e07e030feb9bbe259bb369c772c17db4d56a29a42dacddc4a5c98dbc3417853 not found: ID does not exist" Dec 01 22:21:42 crc kubenswrapper[4962]: I1201 22:21:42.239850 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" path="/var/lib/kubelet/pods/3fd18ee7-d466-464a-8ccf-0805e7b82337/volumes" Dec 01 22:22:02 crc kubenswrapper[4962]: I1201 22:22:02.785683 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:22:02 crc kubenswrapper[4962]: I1201 22:22:02.786165 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:22:02 crc kubenswrapper[4962]: I1201 22:22:02.786208 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 22:22:02 crc kubenswrapper[4962]: I1201 22:22:02.792094 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e86de3a727bf2cd2d1675ee45c0547dea508d02b19d9166f854056d0475f742"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 22:22:02 crc kubenswrapper[4962]: I1201 22:22:02.792185 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://7e86de3a727bf2cd2d1675ee45c0547dea508d02b19d9166f854056d0475f742" gracePeriod=600 Dec 01 22:22:03 crc kubenswrapper[4962]: I1201 22:22:03.436854 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="7e86de3a727bf2cd2d1675ee45c0547dea508d02b19d9166f854056d0475f742" exitCode=0 Dec 01 22:22:03 crc kubenswrapper[4962]: I1201 22:22:03.437272 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"7e86de3a727bf2cd2d1675ee45c0547dea508d02b19d9166f854056d0475f742"} Dec 01 22:22:03 crc kubenswrapper[4962]: I1201 22:22:03.437308 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef"} Dec 01 22:22:03 crc kubenswrapper[4962]: I1201 22:22:03.437328 4962 scope.go:117] "RemoveContainer" containerID="8c484572750533778a7ce8cd28638da562706e712e60fd64e1058f563e05403c" Dec 01 22:22:19 crc kubenswrapper[4962]: I1201 22:22:19.686826 4962 generic.go:334] "Generic (PLEG): container finished" podID="db0505bf-0445-4d21-9bc6-a483fdf94816" containerID="35e25e375748844c604c84f58f348629bceb0f16162dd0afadc585ba149922e5" exitCode=0 Dec 01 22:22:19 crc kubenswrapper[4962]: I1201 22:22:19.686928 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" event={"ID":"db0505bf-0445-4d21-9bc6-a483fdf94816","Type":"ContainerDied","Data":"35e25e375748844c604c84f58f348629bceb0f16162dd0afadc585ba149922e5"} Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.321043 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448281 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km494\" (UniqueName: \"kubernetes.io/projected/db0505bf-0445-4d21-9bc6-a483fdf94816-kube-api-access-km494\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448375 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-ssh-key\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448423 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-combined-ca-bundle\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-1\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448486 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-inventory\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448633 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-0\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-0\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448794 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-1\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.448864 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-extra-config-0\") pod \"db0505bf-0445-4d21-9bc6-a483fdf94816\" (UID: \"db0505bf-0445-4d21-9bc6-a483fdf94816\") " Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.460250 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db0505bf-0445-4d21-9bc6-a483fdf94816-kube-api-access-km494" (OuterVolumeSpecName: "kube-api-access-km494") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "kube-api-access-km494". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.475283 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.481348 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.490825 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.492856 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-inventory" (OuterVolumeSpecName: "inventory") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.494829 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.495333 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.508123 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.514666 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "db0505bf-0445-4d21-9bc6-a483fdf94816" (UID: "db0505bf-0445-4d21-9bc6-a483fdf94816"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552757 4962 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552799 4962 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552811 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km494\" (UniqueName: \"kubernetes.io/projected/db0505bf-0445-4d21-9bc6-a483fdf94816-kube-api-access-km494\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552829 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552842 4962 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552854 4962 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552866 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552895 4962 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.552907 4962 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db0505bf-0445-4d21-9bc6-a483fdf94816-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.730757 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" event={"ID":"db0505bf-0445-4d21-9bc6-a483fdf94816","Type":"ContainerDied","Data":"693fe8fed74b94e01c4d0ca2018e9ad76cdbafea5615d7ef17286dbc8c918bf2"} Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.730856 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="693fe8fed74b94e01c4d0ca2018e9ad76cdbafea5615d7ef17286dbc8c918bf2" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.730884 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pl278" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.866518 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"] Dec 01 22:22:21 crc kubenswrapper[4962]: E1201 22:22:21.867067 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db0505bf-0445-4d21-9bc6-a483fdf94816" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.867090 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0505bf-0445-4d21-9bc6-a483fdf94816" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 22:22:21 crc kubenswrapper[4962]: E1201 22:22:21.867115 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="extract-content" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.867125 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="extract-content" Dec 01 22:22:21 crc kubenswrapper[4962]: E1201 22:22:21.867167 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="registry-server" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.867175 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="registry-server" Dec 01 22:22:21 crc kubenswrapper[4962]: E1201 22:22:21.867200 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="extract-utilities" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.867208 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="extract-utilities" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.867466 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd18ee7-d466-464a-8ccf-0805e7b82337" containerName="registry-server" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.867515 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="db0505bf-0445-4d21-9bc6-a483fdf94816" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.868419 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.872673 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.872914 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.873088 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.873234 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.873253 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.900299 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"] Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.963669 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ldzx\" (UniqueName: \"kubernetes.io/projected/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-kube-api-access-7ldzx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.963818 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.963899 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.964006 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.964238 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc 
kubenswrapper[4962]: I1201 22:22:21.964519 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc kubenswrapper[4962]: I1201 22:22:21.964579 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:21 crc kubenswrapper[4962]: E1201 22:22:21.975739 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb0505bf_0445_4d21_9bc6_a483fdf94816.slice/crio-693fe8fed74b94e01c4d0ca2018e9ad76cdbafea5615d7ef17286dbc8c918bf2\": RecentStats: unable to find data in memory cache]" Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.066773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.067138 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.067196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ldzx\" (UniqueName: \"kubernetes.io/projected/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-kube-api-access-7ldzx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.067238 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.067269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" Dec 01 
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.067296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.067380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.071378 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.072158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.072316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.072956 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.076426 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.088121 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.089366 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ldzx\" (UniqueName: \"kubernetes.io/projected/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-kube-api-access-7ldzx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.204444 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:22:22 crc kubenswrapper[4962]: W1201 22:22:22.867150 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7fabf85_9e84_477d_9831_1f6ff8c52e3e.slice/crio-d87b0bdedd9a45e6c19840f594cc2d88487f0c600c0a8db3ff26d9db0440df7e WatchSource:0}: Error finding container d87b0bdedd9a45e6c19840f594cc2d88487f0c600c0a8db3ff26d9db0440df7e: Status 404 returned error can't find the container with id d87b0bdedd9a45e6c19840f594cc2d88487f0c600c0a8db3ff26d9db0440df7e
Dec 01 22:22:22 crc kubenswrapper[4962]: I1201 22:22:22.871639 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"]
Dec 01 22:22:23 crc kubenswrapper[4962]: I1201 22:22:23.756884 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" event={"ID":"a7fabf85-9e84-477d-9831-1f6ff8c52e3e","Type":"ContainerStarted","Data":"d87b0bdedd9a45e6c19840f594cc2d88487f0c600c0a8db3ff26d9db0440df7e"}
Dec 01 22:22:24 crc kubenswrapper[4962]: I1201 22:22:24.774509 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" event={"ID":"a7fabf85-9e84-477d-9831-1f6ff8c52e3e","Type":"ContainerStarted","Data":"37b23b3e9716bc16b3472fea7de40f79e6ad5a00ba5eee2657865d6879199e75"}
Dec 01 22:22:24 crc kubenswrapper[4962]: I1201 22:22:24.821751 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" podStartSLOduration=3.180239114 podStartE2EDuration="3.821721183s" podCreationTimestamp="2025-12-01 22:22:21 +0000 UTC" firstStartedPulling="2025-12-01 22:22:22.869122453 +0000 UTC m=+2926.970561648" lastFinishedPulling="2025-12-01 22:22:23.510604512 +0000 UTC m=+2927.612043717" observedRunningTime="2025-12-01 22:22:24.80781287 +0000 UTC m=+2928.909252125" watchObservedRunningTime="2025-12-01 22:22:24.821721183 +0000 UTC m=+2928.923160418"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.777518 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mc2jl"]
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.785352 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.817031 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc2jl"]
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.886714 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kr5c\" (UniqueName: \"kubernetes.io/projected/a5f51854-fdc1-437a-86fd-d5ce5977a89f-kube-api-access-4kr5c\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.886817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-catalog-content\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.887112 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-utilities\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.990143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kr5c\" (UniqueName: \"kubernetes.io/projected/a5f51854-fdc1-437a-86fd-d5ce5977a89f-kube-api-access-4kr5c\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.990264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-catalog-content\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.990306 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-utilities\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.990754 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-utilities\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:45 crc kubenswrapper[4962]: I1201 22:22:45.990975 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-catalog-content\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:46 crc kubenswrapper[4962]: I1201 22:22:46.012520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kr5c\" (UniqueName: \"kubernetes.io/projected/a5f51854-fdc1-437a-86fd-d5ce5977a89f-kube-api-access-4kr5c\") pod \"redhat-marketplace-mc2jl\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") " pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:46 crc kubenswrapper[4962]: I1201 22:22:46.126827 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:46 crc kubenswrapper[4962]: I1201 22:22:46.788039 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc2jl"]
Dec 01 22:22:47 crc kubenswrapper[4962]: I1201 22:22:47.101321 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerID="3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0" exitCode=0
Dec 01 22:22:47 crc kubenswrapper[4962]: I1201 22:22:47.101389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc2jl" event={"ID":"a5f51854-fdc1-437a-86fd-d5ce5977a89f","Type":"ContainerDied","Data":"3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0"}
Dec 01 22:22:47 crc kubenswrapper[4962]: I1201 22:22:47.101430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc2jl" event={"ID":"a5f51854-fdc1-437a-86fd-d5ce5977a89f","Type":"ContainerStarted","Data":"3a7db4e101807377e748a5561dc38df85c4454991438e96b7e481835d7c3fdab"}
Dec 01 22:22:49 crc kubenswrapper[4962]: I1201 22:22:49.126700 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerID="76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b" exitCode=0
Dec 01 22:22:49 crc kubenswrapper[4962]: I1201 22:22:49.127008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc2jl" event={"ID":"a5f51854-fdc1-437a-86fd-d5ce5977a89f","Type":"ContainerDied","Data":"76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b"}
Dec 01 22:22:50 crc kubenswrapper[4962]: I1201 22:22:50.146124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc2jl" event={"ID":"a5f51854-fdc1-437a-86fd-d5ce5977a89f","Type":"ContainerStarted","Data":"9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225"}
Dec 01 22:22:50 crc kubenswrapper[4962]: I1201 22:22:50.182741 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mc2jl" podStartSLOduration=2.463516561 podStartE2EDuration="5.182712683s" podCreationTimestamp="2025-12-01 22:22:45 +0000 UTC" firstStartedPulling="2025-12-01 22:22:47.104058268 +0000 UTC m=+2951.205497503" lastFinishedPulling="2025-12-01 22:22:49.82325439 +0000 UTC m=+2953.924693625" observedRunningTime="2025-12-01 22:22:50.167561195 +0000 UTC m=+2954.269000480" watchObservedRunningTime="2025-12-01 22:22:50.182712683 +0000 UTC m=+2954.284151918"
Dec 01 22:22:56 crc kubenswrapper[4962]: I1201 22:22:56.128172 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:56 crc kubenswrapper[4962]: I1201 22:22:56.128828 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:56 crc kubenswrapper[4962]: I1201 22:22:56.192685 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:56 crc kubenswrapper[4962]: I1201 22:22:56.327391 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:56 crc kubenswrapper[4962]: I1201 22:22:56.448828 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc2jl"]
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.283478 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mc2jl" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="registry-server" containerID="cri-o://9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225" gracePeriod=2
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.840664 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.872752 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-utilities\") pod \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") "
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.873180 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-catalog-content\") pod \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") "
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.873236 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kr5c\" (UniqueName: \"kubernetes.io/projected/a5f51854-fdc1-437a-86fd-d5ce5977a89f-kube-api-access-4kr5c\") pod \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\" (UID: \"a5f51854-fdc1-437a-86fd-d5ce5977a89f\") "
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.874276 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-utilities" (OuterVolumeSpecName: "utilities") pod "a5f51854-fdc1-437a-86fd-d5ce5977a89f" (UID: "a5f51854-fdc1-437a-86fd-d5ce5977a89f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.907552 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f51854-fdc1-437a-86fd-d5ce5977a89f-kube-api-access-4kr5c" (OuterVolumeSpecName: "kube-api-access-4kr5c") pod "a5f51854-fdc1-437a-86fd-d5ce5977a89f" (UID: "a5f51854-fdc1-437a-86fd-d5ce5977a89f"). InnerVolumeSpecName "kube-api-access-4kr5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.913245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5f51854-fdc1-437a-86fd-d5ce5977a89f" (UID: "a5f51854-fdc1-437a-86fd-d5ce5977a89f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.976258 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.976294 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f51854-fdc1-437a-86fd-d5ce5977a89f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 22:22:58 crc kubenswrapper[4962]: I1201 22:22:58.976311 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kr5c\" (UniqueName: \"kubernetes.io/projected/a5f51854-fdc1-437a-86fd-d5ce5977a89f-kube-api-access-4kr5c\") on node \"crc\" DevicePath \"\""
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.306213 4962 generic.go:334] "Generic (PLEG): container finished" podID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerID="9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225" exitCode=0
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.306377 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc2jl" event={"ID":"a5f51854-fdc1-437a-86fd-d5ce5977a89f","Type":"ContainerDied","Data":"9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225"}
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.306709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc2jl" event={"ID":"a5f51854-fdc1-437a-86fd-d5ce5977a89f","Type":"ContainerDied","Data":"3a7db4e101807377e748a5561dc38df85c4454991438e96b7e481835d7c3fdab"}
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.306752 4962 scope.go:117] "RemoveContainer" containerID="9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.306397 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc2jl"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.348757 4962 scope.go:117] "RemoveContainer" containerID="76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.387174 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc2jl"]
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.397599 4962 scope.go:117] "RemoveContainer" containerID="3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.402213 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc2jl"]
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.451625 4962 scope.go:117] "RemoveContainer" containerID="9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225"
Dec 01 22:22:59 crc kubenswrapper[4962]: E1201 22:22:59.453601 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225\": container with ID starting with 9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225 not found: ID does not exist" containerID="9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.453650 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225"} err="failed to get container status \"9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225\": rpc error: code = NotFound desc = could not find container \"9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225\": container with ID starting with 9be19d89c27f527812785fc9c8e991c231c14941b55511734c00af34fcc78225 not found: ID does not exist"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.453685 4962 scope.go:117] "RemoveContainer" containerID="76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b"
Dec 01 22:22:59 crc kubenswrapper[4962]: E1201 22:22:59.454226 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b\": container with ID starting with 76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b not found: ID does not exist" containerID="76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.454297 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b"} err="failed to get container status \"76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b\": rpc error: code = NotFound desc = could not find container \"76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b\": container with ID starting with 76b3aeedf9743977e145937c7719d2924f8bcdbb7d0a7c9ee2c96e6951430e6b not found: ID does not exist"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.454322 4962 scope.go:117] "RemoveContainer" containerID="3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0"
Dec 01 22:22:59 crc kubenswrapper[4962]: E1201 22:22:59.454696 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0\": container with ID starting with 3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0 not found: ID does not exist" containerID="3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0"
Dec 01 22:22:59 crc kubenswrapper[4962]: I1201 22:22:59.454728 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0"} err="failed to get container status \"3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0\": rpc error: code = NotFound desc = could not find container \"3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0\": container with ID starting with 3b5c73b73d82a75ca3ad2cf3f1ec17a914c8e9f5c5fee711c1e110f0d3f088f0 not found: ID does not exist"
Dec 01 22:23:00 crc kubenswrapper[4962]: I1201 22:23:00.243255 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" path="/var/lib/kubelet/pods/a5f51854-fdc1-437a-86fd-d5ce5977a89f/volumes"
Dec 01 22:24:32 crc kubenswrapper[4962]: I1201 22:24:32.784645 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:24:32 crc kubenswrapper[4962]: I1201 22:24:32.785253 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:25:02 crc kubenswrapper[4962]: I1201 22:25:02.784097 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:25:02 crc kubenswrapper[4962]: I1201 22:25:02.784660 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:25:08 crc kubenswrapper[4962]: I1201 22:25:08.070554 4962 generic.go:334] "Generic (PLEG): container finished" podID="a7fabf85-9e84-477d-9831-1f6ff8c52e3e" containerID="37b23b3e9716bc16b3472fea7de40f79e6ad5a00ba5eee2657865d6879199e75" exitCode=0
Dec 01 22:25:08 crc kubenswrapper[4962]: I1201 22:25:08.071178 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" event={"ID":"a7fabf85-9e84-477d-9831-1f6ff8c52e3e","Type":"ContainerDied","Data":"37b23b3e9716bc16b3472fea7de40f79e6ad5a00ba5eee2657865d6879199e75"}
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.687448 4962 util.go:48] "No ready sandbox for pod can be found.
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.836391 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-telemetry-combined-ca-bundle\") pod \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") "
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.836466 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-inventory\") pod \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") "
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.836548 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ssh-key\") pod \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") "
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.836588 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-0\") pod \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") "
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.836706 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ldzx\" (UniqueName: \"kubernetes.io/projected/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-kube-api-access-7ldzx\") pod \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") "
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.836761 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-1\") pod \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") "
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.836845 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-2\") pod \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\" (UID: \"a7fabf85-9e84-477d-9831-1f6ff8c52e3e\") "
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.844622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a7fabf85-9e84-477d-9831-1f6ff8c52e3e" (UID: "a7fabf85-9e84-477d-9831-1f6ff8c52e3e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.846256 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-kube-api-access-7ldzx" (OuterVolumeSpecName: "kube-api-access-7ldzx") pod "a7fabf85-9e84-477d-9831-1f6ff8c52e3e" (UID: "a7fabf85-9e84-477d-9831-1f6ff8c52e3e"). InnerVolumeSpecName "kube-api-access-7ldzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.875231 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a7fabf85-9e84-477d-9831-1f6ff8c52e3e" (UID: "a7fabf85-9e84-477d-9831-1f6ff8c52e3e"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.877493 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a7fabf85-9e84-477d-9831-1f6ff8c52e3e" (UID: "a7fabf85-9e84-477d-9831-1f6ff8c52e3e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.886515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a7fabf85-9e84-477d-9831-1f6ff8c52e3e" (UID: "a7fabf85-9e84-477d-9831-1f6ff8c52e3e"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.899079 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a7fabf85-9e84-477d-9831-1f6ff8c52e3e" (UID: "a7fabf85-9e84-477d-9831-1f6ff8c52e3e"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.906874 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-inventory" (OuterVolumeSpecName: "inventory") pod "a7fabf85-9e84-477d-9831-1f6ff8c52e3e" (UID: "a7fabf85-9e84-477d-9831-1f6ff8c52e3e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.941012 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.941053 4962 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.941066 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.941078 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.941089 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.941100 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ldzx\" (UniqueName: \"kubernetes.io/projected/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-kube-api-access-7ldzx\") on node \"crc\" DevicePath \"\""
Dec 01 22:25:09 crc kubenswrapper[4962]: I1201 22:25:09.941112 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a7fabf85-9e84-477d-9831-1f6ff8c52e3e-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.106086 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6" event={"ID":"a7fabf85-9e84-477d-9831-1f6ff8c52e3e","Type":"ContainerDied","Data":"d87b0bdedd9a45e6c19840f594cc2d88487f0c600c0a8db3ff26d9db0440df7e"}
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.106161 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d87b0bdedd9a45e6c19840f594cc2d88487f0c600c0a8db3ff26d9db0440df7e"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.106235 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.274564 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"]
Dec 01 22:25:10 crc kubenswrapper[4962]: E1201 22:25:10.275281 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="extract-content"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.275298 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="extract-content"
Dec 01 22:25:10 crc kubenswrapper[4962]: E1201 22:25:10.275316 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="extract-utilities"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.275323 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="extract-utilities"
Dec 01 22:25:10 crc kubenswrapper[4962]: E1201 22:25:10.275351 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="registry-server"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.275357 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="registry-server"
Dec 01 22:25:10 crc kubenswrapper[4962]: E1201 22:25:10.275383 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fabf85-9e84-477d-9831-1f6ff8c52e3e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.275390 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fabf85-9e84-477d-9831-1f6ff8c52e3e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.275595 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fabf85-9e84-477d-9831-1f6ff8c52e3e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.275618 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f51854-fdc1-437a-86fd-d5ce5977a89f" containerName="registry-server"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.276420 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.284107 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.284210 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.284380 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.285049 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.287032 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"]
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.294450 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.455989 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.456376 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.456550 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.456809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.457082 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.457296 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.457464 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6klqs\" (UniqueName: \"kubernetes.io/projected/6a103f12-9cb1-4018-9db7-67553233f69d-kube-api-access-6klqs\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.560039 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.560093 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.560118 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.560165 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.560219 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.560247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.560266 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6klqs\" (UniqueName: \"kubernetes.io/projected/6a103f12-9cb1-4018-9db7-67553233f69d-kube-api-access-6klqs\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.565630 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.566425 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.568797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.568814 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.570171 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.572093 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.594508 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6klqs\" (UniqueName: \"kubernetes.io/projected/6a103f12-9cb1-4018-9db7-67553233f69d-kube-api-access-6klqs\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:10 crc kubenswrapper[4962]: I1201 22:25:10.601422 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:25:11 crc kubenswrapper[4962]: I1201 22:25:11.196338 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"]
Dec 01 22:25:11 crc kubenswrapper[4962]: W1201 22:25:11.203923 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a103f12_9cb1_4018_9db7_67553233f69d.slice/crio-27a56950b225d06f76aa9b3909f6c256ca9f91d96de283348b642c48080bdd19 WatchSource:0}: Error finding container 27a56950b225d06f76aa9b3909f6c256ca9f91d96de283348b642c48080bdd19: Status 404 returned error can't find the container with id 27a56950b225d06f76aa9b3909f6c256ca9f91d96de283348b642c48080bdd19
Dec 01 22:25:12 crc kubenswrapper[4962]: I1201 22:25:12.134657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz" event={"ID":"6a103f12-9cb1-4018-9db7-67553233f69d","Type":"ContainerStarted","Data":"41f692e303477f3f8c743253b74f600ead584ab6747b65a510b0d10d0e4762ae"}
Dec 01 22:25:12 crc kubenswrapper[4962]: I1201 22:25:12.135888 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz" event={"ID":"6a103f12-9cb1-4018-9db7-67553233f69d","Type":"ContainerStarted","Data":"27a56950b225d06f76aa9b3909f6c256ca9f91d96de283348b642c48080bdd19"}
Dec 01 22:25:12 crc kubenswrapper[4962]: I1201 22:25:12.171895 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz" podStartSLOduration=1.650291743 podStartE2EDuration="2.171864695s" podCreationTimestamp="2025-12-01 22:25:10 +0000 UTC" firstStartedPulling="2025-12-01 22:25:11.206969632 +0000 UTC m=+3095.308408827" lastFinishedPulling="2025-12-01 22:25:11.728542554 +0000 UTC m=+3095.829981779" observedRunningTime="2025-12-01 22:25:12.157755306 +0000 UTC m=+3096.259194551" watchObservedRunningTime="2025-12-01 22:25:12.171864695 +0000 UTC m=+3096.273303930"
Dec 01 22:25:32 crc kubenswrapper[4962]: I1201 22:25:32.784887 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:25:32 crc kubenswrapper[4962]: I1201 22:25:32.786551 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:25:32 crc kubenswrapper[4962]: I1201 22:25:32.786665 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 22:25:32 crc kubenswrapper[4962]: I1201 22:25:32.787576 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 22:25:32 crc kubenswrapper[4962]: I1201 22:25:32.787786 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" gracePeriod=600 Dec 01 22:25:32 crc kubenswrapper[4962]: E1201 22:25:32.939133 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:25:33 crc kubenswrapper[4962]: I1201 22:25:33.437926 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" exitCode=0 Dec 01 22:25:33 crc kubenswrapper[4962]: I1201 22:25:33.438021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef"} Dec 01 22:25:33 crc kubenswrapper[4962]: I1201 22:25:33.438332 4962 scope.go:117] "RemoveContainer" containerID="7e86de3a727bf2cd2d1675ee45c0547dea508d02b19d9166f854056d0475f742" Dec 01 22:25:33 crc kubenswrapper[4962]: I1201 22:25:33.439514 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:25:33 crc kubenswrapper[4962]: E1201 22:25:33.440175 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:25:44 crc kubenswrapper[4962]: I1201 22:25:44.220344 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:25:44 crc kubenswrapper[4962]: E1201 22:25:44.221016 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:25:57 crc kubenswrapper[4962]: I1201 22:25:57.220512 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:25:57 crc kubenswrapper[4962]: E1201 22:25:57.223582 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:26:11 crc kubenswrapper[4962]: I1201 22:26:11.220856 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:26:11 crc kubenswrapper[4962]: E1201 22:26:11.222310 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:26:23 crc kubenswrapper[4962]: I1201 22:26:23.219671 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:26:23 crc kubenswrapper[4962]: E1201 22:26:23.220521 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:26:34 crc kubenswrapper[4962]: I1201 22:26:34.220123 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:26:34 crc kubenswrapper[4962]: E1201 22:26:34.221376 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:26:45 crc kubenswrapper[4962]: I1201 22:26:45.220141 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:26:45 crc kubenswrapper[4962]: E1201 22:26:45.223021 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:26:56 crc kubenswrapper[4962]: I1201 22:26:56.247484 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:26:56 crc kubenswrapper[4962]: E1201 22:26:56.255189 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:27:10 crc kubenswrapper[4962]: I1201 22:27:10.219882 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:27:10 crc kubenswrapper[4962]: E1201 22:27:10.220740 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:27:24 crc kubenswrapper[4962]: I1201 22:27:24.232006 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:27:24 crc kubenswrapper[4962]: E1201 22:27:24.233181 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:27:34 crc kubenswrapper[4962]: I1201 22:27:34.156434 4962 generic.go:334] "Generic (PLEG): container finished" podID="6a103f12-9cb1-4018-9db7-67553233f69d" containerID="41f692e303477f3f8c743253b74f600ead584ab6747b65a510b0d10d0e4762ae" exitCode=0 Dec 01 22:27:34 crc kubenswrapper[4962]: I1201 22:27:34.156544 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz" event={"ID":"6a103f12-9cb1-4018-9db7-67553233f69d","Type":"ContainerDied","Data":"41f692e303477f3f8c743253b74f600ead584ab6747b65a510b0d10d0e4762ae"} Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.914393 4962 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.933682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-0\") pod \"6a103f12-9cb1-4018-9db7-67553233f69d\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") "
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.934462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-telemetry-power-monitoring-combined-ca-bundle\") pod \"6a103f12-9cb1-4018-9db7-67553233f69d\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") "
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.934701 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-1\") pod \"6a103f12-9cb1-4018-9db7-67553233f69d\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") "
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.934902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-inventory\") pod \"6a103f12-9cb1-4018-9db7-67553233f69d\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") "
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.935130 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-2\") pod \"6a103f12-9cb1-4018-9db7-67553233f69d\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") "
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.935331 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ssh-key\") pod \"6a103f12-9cb1-4018-9db7-67553233f69d\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") "
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.935572 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6klqs\" (UniqueName: \"kubernetes.io/projected/6a103f12-9cb1-4018-9db7-67553233f69d-kube-api-access-6klqs\") pod \"6a103f12-9cb1-4018-9db7-67553233f69d\" (UID: \"6a103f12-9cb1-4018-9db7-67553233f69d\") "
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.945014 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "6a103f12-9cb1-4018-9db7-67553233f69d" (UID: "6a103f12-9cb1-4018-9db7-67553233f69d"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.949388 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a103f12-9cb1-4018-9db7-67553233f69d-kube-api-access-6klqs" (OuterVolumeSpecName: "kube-api-access-6klqs") pod "6a103f12-9cb1-4018-9db7-67553233f69d" (UID: "6a103f12-9cb1-4018-9db7-67553233f69d"). InnerVolumeSpecName "kube-api-access-6klqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 22:27:35 crc kubenswrapper[4962]: I1201 22:27:35.977727 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-inventory" (OuterVolumeSpecName: "inventory") pod "6a103f12-9cb1-4018-9db7-67553233f69d" (UID: "6a103f12-9cb1-4018-9db7-67553233f69d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.001818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "6a103f12-9cb1-4018-9db7-67553233f69d" (UID: "6a103f12-9cb1-4018-9db7-67553233f69d"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.002823 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "6a103f12-9cb1-4018-9db7-67553233f69d" (UID: "6a103f12-9cb1-4018-9db7-67553233f69d"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.011055 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6a103f12-9cb1-4018-9db7-67553233f69d" (UID: "6a103f12-9cb1-4018-9db7-67553233f69d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.027533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "6a103f12-9cb1-4018-9db7-67553233f69d" (UID: "6a103f12-9cb1-4018-9db7-67553233f69d"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.038519 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.038552 4962 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.038566 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.038578 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.038587 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.038597 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a103f12-9cb1-4018-9db7-67553233f69d-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.038607 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6klqs\" (UniqueName: \"kubernetes.io/projected/6a103f12-9cb1-4018-9db7-67553233f69d-kube-api-access-6klqs\") on node \"crc\" DevicePath \"\""
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.179039 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz" event={"ID":"6a103f12-9cb1-4018-9db7-67553233f69d","Type":"ContainerDied","Data":"27a56950b225d06f76aa9b3909f6c256ca9f91d96de283348b642c48080bdd19"}
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.179372 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27a56950b225d06f76aa9b3909f6c256ca9f91d96de283348b642c48080bdd19"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.179151 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.228701 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef"
Dec 01 22:27:36 crc kubenswrapper[4962]: E1201 22:27:36.229125 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.320736 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"]
Dec 01 22:27:36 crc kubenswrapper[4962]: E1201 22:27:36.321282 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a103f12-9cb1-4018-9db7-67553233f69d" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.321300 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a103f12-9cb1-4018-9db7-67553233f69d" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.321565 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a103f12-9cb1-4018-9db7-67553233f69d" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.322587 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.324423 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mpxjd"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.325716 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.325771 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.325853 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.326687 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.333120 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"]
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.347281 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.347695 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.347722 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2fg\" (UniqueName: \"kubernetes.io/projected/e3c41ae3-36b2-43dd-9580-fac72dc88d09-kube-api-access-hc2fg\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.347756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.347782 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.449689 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.449849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.449867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2fg\" (UniqueName: \"kubernetes.io/projected/e3c41ae3-36b2-43dd-9580-fac72dc88d09-kube-api-access-hc2fg\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.449890 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.449910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.454292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.454532 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.454574 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.455126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.477127 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2fg\" (UniqueName: \"kubernetes.io/projected/e3c41ae3-36b2-43dd-9580-fac72dc88d09-kube-api-access-hc2fg\") pod \"logging-edpm-deployment-openstack-edpm-ipam-xrsvk\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:36 crc kubenswrapper[4962]: I1201 22:27:36.641466 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:37 crc kubenswrapper[4962]: I1201 22:27:37.239396 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"]
Dec 01 22:27:37 crc kubenswrapper[4962]: I1201 22:27:37.241826 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 22:27:38 crc kubenswrapper[4962]: I1201 22:27:38.208888 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk" event={"ID":"e3c41ae3-36b2-43dd-9580-fac72dc88d09","Type":"ContainerStarted","Data":"a54c591fb79f0dcffa0bbbd12fb0da69842a6c541ac5e7ec75312795d94ed647"}
Dec 01 22:27:39 crc kubenswrapper[4962]: I1201 22:27:39.228014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk" event={"ID":"e3c41ae3-36b2-43dd-9580-fac72dc88d09","Type":"ContainerStarted","Data":"2d480c9b9581df13ba905254fafcb14ecf7a607eb008970c7faec54a0f6d9f8f"}
Dec 01 22:27:51 crc kubenswrapper[4962]: I1201 22:27:51.220753 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef"
Dec 01 22:27:51 crc kubenswrapper[4962]: E1201 22:27:51.222034 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:27:56 crc kubenswrapper[4962]: I1201 22:27:56.500124 4962 generic.go:334] "Generic (PLEG): container finished" podID="e3c41ae3-36b2-43dd-9580-fac72dc88d09" containerID="2d480c9b9581df13ba905254fafcb14ecf7a607eb008970c7faec54a0f6d9f8f" exitCode=0
Dec 01 22:27:56 crc kubenswrapper[4962]: I1201 22:27:56.500356 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk" event={"ID":"e3c41ae3-36b2-43dd-9580-fac72dc88d09","Type":"ContainerDied","Data":"2d480c9b9581df13ba905254fafcb14ecf7a607eb008970c7faec54a0f6d9f8f"}
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.035620 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk"
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.152699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-1\") pod \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") "
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.152947 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc2fg\" (UniqueName: \"kubernetes.io/projected/e3c41ae3-36b2-43dd-9580-fac72dc88d09-kube-api-access-hc2fg\") pod \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") "
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.153019 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-ssh-key\") pod \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") "
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.153055 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-0\") pod \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") "
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.153122 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-inventory\") pod \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\" (UID: \"e3c41ae3-36b2-43dd-9580-fac72dc88d09\") "
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.159208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c41ae3-36b2-43dd-9580-fac72dc88d09-kube-api-access-hc2fg" (OuterVolumeSpecName: "kube-api-access-hc2fg") pod "e3c41ae3-36b2-43dd-9580-fac72dc88d09" (UID: "e3c41ae3-36b2-43dd-9580-fac72dc88d09"). InnerVolumeSpecName "kube-api-access-hc2fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.182729 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-inventory" (OuterVolumeSpecName: "inventory") pod "e3c41ae3-36b2-43dd-9580-fac72dc88d09" (UID: "e3c41ae3-36b2-43dd-9580-fac72dc88d09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.183195 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "e3c41ae3-36b2-43dd-9580-fac72dc88d09" (UID: "e3c41ae3-36b2-43dd-9580-fac72dc88d09"). InnerVolumeSpecName "logging-compute-config-data-0".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.185799 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3c41ae3-36b2-43dd-9580-fac72dc88d09" (UID: "e3c41ae3-36b2-43dd-9580-fac72dc88d09"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.187553 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "e3c41ae3-36b2-43dd-9580-fac72dc88d09" (UID: "e3c41ae3-36b2-43dd-9580-fac72dc88d09"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.258652 4962 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.258725 4962 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.258739 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc2fg\" (UniqueName: \"kubernetes.io/projected/e3c41ae3-36b2-43dd-9580-fac72dc88d09-kube-api-access-hc2fg\") on node \"crc\" DevicePath \"\"" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.258749 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.258779 4962 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3c41ae3-36b2-43dd-9580-fac72dc88d09-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.529019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk" event={"ID":"e3c41ae3-36b2-43dd-9580-fac72dc88d09","Type":"ContainerDied","Data":"a54c591fb79f0dcffa0bbbd12fb0da69842a6c541ac5e7ec75312795d94ed647"} Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.529062 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a54c591fb79f0dcffa0bbbd12fb0da69842a6c541ac5e7ec75312795d94ed647" Dec 01 22:27:58 crc kubenswrapper[4962]: I1201 22:27:58.529129 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-xrsvk" Dec 01 22:28:04 crc kubenswrapper[4962]: I1201 22:28:04.220408 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:28:04 crc kubenswrapper[4962]: E1201 22:28:04.221605 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:28:15 crc kubenswrapper[4962]: I1201 22:28:15.220990 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:28:15 crc kubenswrapper[4962]: E1201 22:28:15.222108 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.321410 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vpgpb"] Dec 01 22:28:21 crc kubenswrapper[4962]: E1201 22:28:21.334089 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c41ae3-36b2-43dd-9580-fac72dc88d09" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.334503 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c41ae3-36b2-43dd-9580-fac72dc88d09" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.335105 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c41ae3-36b2-43dd-9580-fac72dc88d09" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.338315 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.372253 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vpgpb"] Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.413733 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-utilities\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.413836 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjd8g\" (UniqueName: \"kubernetes.io/projected/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-kube-api-access-bjd8g\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.413874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-catalog-content\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.516985 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-utilities\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.517102 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjd8g\" (UniqueName: \"kubernetes.io/projected/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-kube-api-access-bjd8g\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.517134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-catalog-content\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.517577 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-utilities\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.517646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-catalog-content\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.542568 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bjd8g\" (UniqueName: \"kubernetes.io/projected/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-kube-api-access-bjd8g\") pod \"community-operators-vpgpb\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:21 crc kubenswrapper[4962]: I1201 22:28:21.681875 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:22 crc kubenswrapper[4962]: I1201 22:28:22.241072 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vpgpb"] Dec 01 22:28:22 crc kubenswrapper[4962]: I1201 22:28:22.929707 4962 generic.go:334] "Generic (PLEG): container finished" podID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerID="d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5" exitCode=0 Dec 01 22:28:22 crc kubenswrapper[4962]: I1201 22:28:22.929818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpgpb" event={"ID":"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41","Type":"ContainerDied","Data":"d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5"} Dec 01 22:28:22 crc kubenswrapper[4962]: I1201 22:28:22.930264 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpgpb" event={"ID":"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41","Type":"ContainerStarted","Data":"5c3474f6bb4aa79db653e39e3d9f2e06505b630090611752ba76a2a4d036e946"} Dec 01 22:28:24 crc kubenswrapper[4962]: I1201 22:28:24.958659 4962 generic.go:334] "Generic (PLEG): container finished" podID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerID="5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878" exitCode=0 Dec 01 22:28:24 crc kubenswrapper[4962]: I1201 22:28:24.958794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpgpb" event={"ID":"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41","Type":"ContainerDied","Data":"5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878"} Dec 01 22:28:25 crc kubenswrapper[4962]: I1201 22:28:25.980973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpgpb" event={"ID":"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41","Type":"ContainerStarted","Data":"1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11"} Dec 01 22:28:26 crc kubenswrapper[4962]: I1201 22:28:26.022049 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vpgpb" podStartSLOduration=2.267690761 podStartE2EDuration="5.02203051s" podCreationTimestamp="2025-12-01 22:28:21 +0000 UTC" firstStartedPulling="2025-12-01 22:28:22.932179092 +0000 UTC m=+3287.033618297" lastFinishedPulling="2025-12-01 22:28:25.686518851 +0000 UTC m=+3289.787958046" observedRunningTime="2025-12-01 22:28:26.012950427 +0000 UTC m=+3290.114389662" watchObservedRunningTime="2025-12-01 22:28:26.02203051 +0000 UTC m=+3290.123469705" Dec 01 22:28:26 crc kubenswrapper[4962]: I1201 22:28:26.230275 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:28:26 crc kubenswrapper[4962]: E1201 22:28:26.230713 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:28:31 crc kubenswrapper[4962]: I1201 22:28:31.682586 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:31 crc kubenswrapper[4962]: I1201 22:28:31.683041 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:31 crc kubenswrapper[4962]: I1201 22:28:31.754278 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:32 crc kubenswrapper[4962]: I1201 22:28:32.176715 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:32 crc kubenswrapper[4962]: I1201 22:28:32.249715 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vpgpb"] Dec 01 22:28:34 crc kubenswrapper[4962]: I1201 22:28:34.135885 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vpgpb" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="registry-server" containerID="cri-o://1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11" gracePeriod=2 Dec 01 22:28:34 crc kubenswrapper[4962]: I1201 22:28:34.788083 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:34 crc kubenswrapper[4962]: I1201 22:28:34.904624 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjd8g\" (UniqueName: \"kubernetes.io/projected/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-kube-api-access-bjd8g\") pod \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " Dec 01 22:28:34 crc kubenswrapper[4962]: I1201 22:28:34.905061 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-catalog-content\") pod \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " Dec 01 22:28:34 crc kubenswrapper[4962]: I1201 22:28:34.905347 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-utilities\") pod \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\" (UID: \"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41\") " Dec 01 22:28:34 crc kubenswrapper[4962]: I1201 22:28:34.906671 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-utilities" (OuterVolumeSpecName: "utilities") pod "28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" (UID: "28d7ef93-f48f-4243-ae99-b2c8ed1a5a41"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:28:34 crc kubenswrapper[4962]: I1201 22:28:34.915223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-kube-api-access-bjd8g" (OuterVolumeSpecName: "kube-api-access-bjd8g") pod "28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" (UID: "28d7ef93-f48f-4243-ae99-b2c8ed1a5a41"). InnerVolumeSpecName "kube-api-access-bjd8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.007774 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" (UID: "28d7ef93-f48f-4243-ae99-b2c8ed1a5a41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.008382 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.008408 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjd8g\" (UniqueName: \"kubernetes.io/projected/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-kube-api-access-bjd8g\") on node \"crc\" DevicePath \"\"" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.008420 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.170489 4962 generic.go:334] "Generic (PLEG): container finished" podID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerID="1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11" exitCode=0 Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.170531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpgpb" event={"ID":"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41","Type":"ContainerDied","Data":"1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11"} Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.170563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vpgpb" event={"ID":"28d7ef93-f48f-4243-ae99-b2c8ed1a5a41","Type":"ContainerDied","Data":"5c3474f6bb4aa79db653e39e3d9f2e06505b630090611752ba76a2a4d036e946"} Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.170558 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vpgpb" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.170649 4962 scope.go:117] "RemoveContainer" containerID="1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.207474 4962 scope.go:117] "RemoveContainer" containerID="5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.211921 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vpgpb"] Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.221717 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vpgpb"] Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.232776 4962 scope.go:117] "RemoveContainer" containerID="d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.281272 4962 scope.go:117] "RemoveContainer" containerID="1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11" Dec 01 22:28:35 crc kubenswrapper[4962]: E1201 22:28:35.281727 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11\": container with ID starting with 1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11 not found: ID does not exist" containerID="1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.281775 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11"} err="failed to get container status \"1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11\": rpc error: code = NotFound desc = could not find container \"1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11\": container with ID starting with 1dfe60ed001b9eba72ca244e24d43d5bf53e6641af4be2c870788ca8e295df11 not found: ID does not exist" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.281802 4962 scope.go:117] "RemoveContainer" containerID="5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878" Dec 01 22:28:35 crc kubenswrapper[4962]: E1201 22:28:35.282197 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878\": container with ID starting with 5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878 not found: ID does not exist" containerID="5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.282228 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878"} err="failed to get container status \"5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878\": rpc error: code = NotFound desc = could not find container \"5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878\": container with ID starting with 5c00df96a7d69274cfc63d41dad5bafb6d38b1f2a2f8162dc494ae6b4b941878 not found: ID does not exist" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.282253 4962 scope.go:117] "RemoveContainer" 
containerID="d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5" Dec 01 22:28:35 crc kubenswrapper[4962]: E1201 22:28:35.282606 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5\": container with ID starting with d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5 not found: ID does not exist" containerID="d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5" Dec 01 22:28:35 crc kubenswrapper[4962]: I1201 22:28:35.282625 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5"} err="failed to get container status \"d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5\": rpc error: code = NotFound desc = could not find container \"d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5\": container with ID starting with d425e42a72c6e246ff861a4e6699bbc7bc11d39e3db01feef5dd5cc9c212c5a5 not found: ID does not exist" Dec 01 22:28:36 crc kubenswrapper[4962]: I1201 22:28:36.237527 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" path="/var/lib/kubelet/pods/28d7ef93-f48f-4243-ae99-b2c8ed1a5a41/volumes" Dec 01 22:28:38 crc kubenswrapper[4962]: I1201 22:28:38.219979 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:28:38 crc kubenswrapper[4962]: E1201 22:28:38.220659 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:28:52 crc kubenswrapper[4962]: I1201 22:28:52.220416 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:28:52 crc kubenswrapper[4962]: E1201 22:28:52.221245 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:29:05 crc kubenswrapper[4962]: I1201 22:29:05.221114 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:29:05 crc kubenswrapper[4962]: E1201 22:29:05.221985 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.032370 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-b874t"] Dec 01 22:29:16 crc kubenswrapper[4962]: E1201 22:29:16.033348 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="extract-utilities" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.033455 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="extract-utilities" Dec 01 22:29:16 crc kubenswrapper[4962]: E1201 22:29:16.033497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="registry-server" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.033505 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="registry-server" Dec 01 22:29:16 crc kubenswrapper[4962]: E1201 22:29:16.033539 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="extract-content" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.033548 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="extract-content" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.033827 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d7ef93-f48f-4243-ae99-b2c8ed1a5a41" containerName="registry-server" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.035914 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.084060 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b874t"] Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.197635 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-utilities\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.197700 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-catalog-content\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.198346 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwx8z\" (UniqueName: \"kubernetes.io/projected/2df79f92-258a-4579-891c-a3994888850d-kube-api-access-zwx8z\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.301573 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwx8z\" (UniqueName: \"kubernetes.io/projected/2df79f92-258a-4579-891c-a3994888850d-kube-api-access-zwx8z\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.301740 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-utilities\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.302382 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-utilities\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.302452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-catalog-content\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.302855 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-catalog-content\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.334376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwx8z\" (UniqueName: \"kubernetes.io/projected/2df79f92-258a-4579-891c-a3994888850d-kube-api-access-zwx8z\") pod \"redhat-operators-b874t\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.371855 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:16 crc kubenswrapper[4962]: I1201 22:29:16.859993 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b874t"] Dec 01 22:29:17 crc kubenswrapper[4962]: I1201 22:29:17.772557 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df79f92-258a-4579-891c-a3994888850d" containerID="438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e" exitCode=0 Dec 01 22:29:17 crc kubenswrapper[4962]: I1201 22:29:17.773058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b874t" event={"ID":"2df79f92-258a-4579-891c-a3994888850d","Type":"ContainerDied","Data":"438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e"} Dec 01 22:29:17 crc kubenswrapper[4962]: I1201 22:29:17.773091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b874t" event={"ID":"2df79f92-258a-4579-891c-a3994888850d","Type":"ContainerStarted","Data":"adb27bfdad8eb0fca4230d6aa170ec7892473c6e89e5ce2a8e6685af89934be0"} Dec 01 22:29:19 crc kubenswrapper[4962]: I1201 22:29:19.220346 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:29:19 crc kubenswrapper[4962]: E1201 22:29:19.221081 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:29:19 crc kubenswrapper[4962]: I1201 22:29:19.794756 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b874t" event={"ID":"2df79f92-258a-4579-891c-a3994888850d","Type":"ContainerStarted","Data":"50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910"} Dec 01 22:29:21 crc kubenswrapper[4962]: I1201 22:29:21.822263 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df79f92-258a-4579-891c-a3994888850d" containerID="50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910" exitCode=0 Dec 01 22:29:21 crc kubenswrapper[4962]: I1201 22:29:21.822446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b874t" event={"ID":"2df79f92-258a-4579-891c-a3994888850d","Type":"ContainerDied","Data":"50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910"} Dec 01 22:29:22 crc kubenswrapper[4962]: I1201 22:29:22.838146 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b874t" event={"ID":"2df79f92-258a-4579-891c-a3994888850d","Type":"ContainerStarted","Data":"3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963"} Dec 01 22:29:22 crc kubenswrapper[4962]: I1201 22:29:22.864392 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b874t" podStartSLOduration=3.368296078 podStartE2EDuration="7.864374801s" podCreationTimestamp="2025-12-01 22:29:15 +0000 UTC" firstStartedPulling="2025-12-01 22:29:17.776694204 +0000 UTC m=+3341.878133389" lastFinishedPulling="2025-12-01 22:29:22.272772917 +0000 UTC m=+3346.374212112" observedRunningTime="2025-12-01 22:29:22.86327215 +0000 
UTC m=+3346.964711345" watchObservedRunningTime="2025-12-01 22:29:22.864374801 +0000 UTC m=+3346.965813986" Dec 01 22:29:26 crc kubenswrapper[4962]: I1201 22:29:26.372361 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:26 crc kubenswrapper[4962]: I1201 22:29:26.372900 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:27 crc kubenswrapper[4962]: I1201 22:29:27.463222 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b874t" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="registry-server" probeResult="failure" output=< Dec 01 22:29:27 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:29:27 crc kubenswrapper[4962]: > Dec 01 22:29:30 crc kubenswrapper[4962]: I1201 22:29:30.220978 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:29:30 crc kubenswrapper[4962]: E1201 22:29:30.221858 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:29:36 crc kubenswrapper[4962]: I1201 22:29:36.449667 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:36 crc kubenswrapper[4962]: I1201 22:29:36.529181 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:36 crc kubenswrapper[4962]: I1201 22:29:36.690677 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b874t"] Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.008038 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b874t" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="registry-server" containerID="cri-o://3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963" gracePeriod=2 Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.628333 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.783993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-utilities\") pod \"2df79f92-258a-4579-891c-a3994888850d\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.784144 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-catalog-content\") pod \"2df79f92-258a-4579-891c-a3994888850d\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.784307 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwx8z\" (UniqueName: \"kubernetes.io/projected/2df79f92-258a-4579-891c-a3994888850d-kube-api-access-zwx8z\") pod \"2df79f92-258a-4579-891c-a3994888850d\" (UID: \"2df79f92-258a-4579-891c-a3994888850d\") " Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.785037 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-utilities" (OuterVolumeSpecName: "utilities") pod "2df79f92-258a-4579-891c-a3994888850d" (UID: "2df79f92-258a-4579-891c-a3994888850d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.790464 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df79f92-258a-4579-891c-a3994888850d-kube-api-access-zwx8z" (OuterVolumeSpecName: "kube-api-access-zwx8z") pod "2df79f92-258a-4579-891c-a3994888850d" (UID: "2df79f92-258a-4579-891c-a3994888850d"). InnerVolumeSpecName "kube-api-access-zwx8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.888580 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwx8z\" (UniqueName: \"kubernetes.io/projected/2df79f92-258a-4579-891c-a3994888850d-kube-api-access-zwx8z\") on node \"crc\" DevicePath \"\"" Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.888632 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.899583 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2df79f92-258a-4579-891c-a3994888850d" (UID: "2df79f92-258a-4579-891c-a3994888850d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:29:38 crc kubenswrapper[4962]: I1201 22:29:38.990456 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df79f92-258a-4579-891c-a3994888850d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.022860 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df79f92-258a-4579-891c-a3994888850d" containerID="3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963" exitCode=0 Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.022926 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b874t" event={"ID":"2df79f92-258a-4579-891c-a3994888850d","Type":"ContainerDied","Data":"3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963"} Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.022973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b874t" event={"ID":"2df79f92-258a-4579-891c-a3994888850d","Type":"ContainerDied","Data":"adb27bfdad8eb0fca4230d6aa170ec7892473c6e89e5ce2a8e6685af89934be0"} Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.022992 4962 scope.go:117] "RemoveContainer" containerID="3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.023037 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b874t" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.047026 4962 scope.go:117] "RemoveContainer" containerID="50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.080639 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b874t"] Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.085780 4962 scope.go:117] "RemoveContainer" containerID="438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.091984 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b874t"] Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.154917 4962 scope.go:117] "RemoveContainer" containerID="3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963" Dec 01 22:29:39 crc kubenswrapper[4962]: E1201 22:29:39.155386 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963\": container with ID starting with 3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963 not found: ID does not exist" containerID="3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.155417 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963"} err="failed to get container status \"3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963\": rpc error: code = NotFound desc = could not find container \"3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963\": container with ID starting with 3e56c41ec3e06f7cbb0e21a87334a4265f8cdf33bf8117f4cfb671c53540a963 not found: ID does not exist" Dec 01 22:29:39 crc 
kubenswrapper[4962]: I1201 22:29:39.155438 4962 scope.go:117] "RemoveContainer" containerID="50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910" Dec 01 22:29:39 crc kubenswrapper[4962]: E1201 22:29:39.155655 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910\": container with ID starting with 50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910 not found: ID does not exist" containerID="50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.155678 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910"} err="failed to get container status \"50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910\": rpc error: code = NotFound desc = could not find container \"50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910\": container with ID starting with 50fedaac5868ed373e47ca17b887b51118d18a29f253e744021eac026b1a6910 not found: ID does not exist" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.155692 4962 scope.go:117] "RemoveContainer" containerID="438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e" Dec 01 22:29:39 crc kubenswrapper[4962]: E1201 22:29:39.155981 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e\": container with ID starting with 438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e not found: ID does not exist" containerID="438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e" Dec 01 22:29:39 crc kubenswrapper[4962]: I1201 22:29:39.156004 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e"} err="failed to get container status \"438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e\": rpc error: code = NotFound desc = could not find container \"438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e\": container with ID starting with 438e21c5790b6f3296a4c83d75d76a419494e7f97b6e714a7f094f859a77a86e not found: ID does not exist" Dec 01 22:29:40 crc kubenswrapper[4962]: I1201 22:29:40.243572 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df79f92-258a-4579-891c-a3994888850d" path="/var/lib/kubelet/pods/2df79f92-258a-4579-891c-a3994888850d/volumes" Dec 01 22:29:45 crc kubenswrapper[4962]: I1201 22:29:45.220473 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:29:45 crc kubenswrapper[4962]: E1201 22:29:45.221605 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:29:59 crc kubenswrapper[4962]: I1201 22:29:59.220444 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" 
Dec 01 22:29:59 crc kubenswrapper[4962]: E1201 22:29:59.221465 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.168236 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"]
Dec 01 22:30:00 crc kubenswrapper[4962]: E1201 22:30:00.169063 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="extract-utilities"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.169087 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="extract-utilities"
Dec 01 22:30:00 crc kubenswrapper[4962]: E1201 22:30:00.169168 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="extract-content"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.169182 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="extract-content"
Dec 01 22:30:00 crc kubenswrapper[4962]: E1201 22:30:00.169218 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="registry-server"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.169232 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="registry-server"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.169685 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df79f92-258a-4579-891c-a3994888850d" containerName="registry-server"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.171101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.173355 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.173492 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.191354 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"]
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.229694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjgl\" (UniqueName: \"kubernetes.io/projected/e43abc92-633e-496a-9ec0-68bb2520c12d-kube-api-access-4sjgl\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.230057 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e43abc92-633e-496a-9ec0-68bb2520c12d-secret-volume\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.230366 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e43abc92-633e-496a-9ec0-68bb2520c12d-config-volume\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.333830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e43abc92-633e-496a-9ec0-68bb2520c12d-config-volume\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.334305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjgl\" (UniqueName: \"kubernetes.io/projected/e43abc92-633e-496a-9ec0-68bb2520c12d-kube-api-access-4sjgl\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.334367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e43abc92-633e-496a-9ec0-68bb2520c12d-secret-volume\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.335237 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e43abc92-633e-496a-9ec0-68bb2520c12d-config-volume\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.340899 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e43abc92-633e-496a-9ec0-68bb2520c12d-secret-volume\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.356709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjgl\" (UniqueName: \"kubernetes.io/projected/e43abc92-633e-496a-9ec0-68bb2520c12d-kube-api-access-4sjgl\") pod \"collect-profiles-29410470-pbdpn\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:00 crc kubenswrapper[4962]: I1201 22:30:00.509192 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"
Dec 01 22:30:01 crc kubenswrapper[4962]: I1201 22:30:01.017625 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"]
Dec 01 22:30:01 crc kubenswrapper[4962]: W1201 22:30:01.021185 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode43abc92_633e_496a_9ec0_68bb2520c12d.slice/crio-83f469643cf50d820c5296fc677c62f60b62b58f6e66d9fb3531bb2f301f24f3 WatchSource:0}: Error finding container 83f469643cf50d820c5296fc677c62f60b62b58f6e66d9fb3531bb2f301f24f3: Status 404 returned error can't find the container with id 83f469643cf50d820c5296fc677c62f60b62b58f6e66d9fb3531bb2f301f24f3
Dec 01 22:30:01 crc kubenswrapper[4962]: I1201 22:30:01.335320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn" event={"ID":"e43abc92-633e-496a-9ec0-68bb2520c12d","Type":"ContainerStarted","Data":"c91095aa1dd359fbddea12128fa7e2c98fb93ea7a9b63ecddcfbea15f9b270da"}
Dec 01 22:30:01 crc kubenswrapper[4962]: I1201 22:30:01.335378 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn" event={"ID":"e43abc92-633e-496a-9ec0-68bb2520c12d","Type":"ContainerStarted","Data":"83f469643cf50d820c5296fc677c62f60b62b58f6e66d9fb3531bb2f301f24f3"}
Dec 01 22:30:01 crc kubenswrapper[4962]: I1201 22:30:01.362825 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn" podStartSLOduration=1.362797845 podStartE2EDuration="1.362797845s" podCreationTimestamp="2025-12-01 22:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 22:30:01.358785863 +0000 UTC m=+3385.460225088" watchObservedRunningTime="2025-12-01 22:30:01.362797845 +0000 UTC m=+3385.464237060"
Dec 01 22:30:02 crc kubenswrapper[4962]: I1201 22:30:02.369887 4962 generic.go:334] "Generic (PLEG): container finished" podID="e43abc92-633e-496a-9ec0-68bb2520c12d" containerID="c91095aa1dd359fbddea12128fa7e2c98fb93ea7a9b63ecddcfbea15f9b270da" exitCode=0
Dec 01 22:30:02 crc kubenswrapper[4962]: I1201 22:30:02.370676
4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn" event={"ID":"e43abc92-633e-496a-9ec0-68bb2520c12d","Type":"ContainerDied","Data":"c91095aa1dd359fbddea12128fa7e2c98fb93ea7a9b63ecddcfbea15f9b270da"} Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.758095 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn" Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.815158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sjgl\" (UniqueName: \"kubernetes.io/projected/e43abc92-633e-496a-9ec0-68bb2520c12d-kube-api-access-4sjgl\") pod \"e43abc92-633e-496a-9ec0-68bb2520c12d\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.815476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e43abc92-633e-496a-9ec0-68bb2520c12d-config-volume\") pod \"e43abc92-633e-496a-9ec0-68bb2520c12d\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.815543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e43abc92-633e-496a-9ec0-68bb2520c12d-secret-volume\") pod \"e43abc92-633e-496a-9ec0-68bb2520c12d\" (UID: \"e43abc92-633e-496a-9ec0-68bb2520c12d\") " Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.816267 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e43abc92-633e-496a-9ec0-68bb2520c12d-config-volume" (OuterVolumeSpecName: "config-volume") pod "e43abc92-633e-496a-9ec0-68bb2520c12d" (UID: "e43abc92-633e-496a-9ec0-68bb2520c12d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.822763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43abc92-633e-496a-9ec0-68bb2520c12d-kube-api-access-4sjgl" (OuterVolumeSpecName: "kube-api-access-4sjgl") pod "e43abc92-633e-496a-9ec0-68bb2520c12d" (UID: "e43abc92-633e-496a-9ec0-68bb2520c12d"). InnerVolumeSpecName "kube-api-access-4sjgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.835170 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e43abc92-633e-496a-9ec0-68bb2520c12d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e43abc92-633e-496a-9ec0-68bb2520c12d" (UID: "e43abc92-633e-496a-9ec0-68bb2520c12d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.918496 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sjgl\" (UniqueName: \"kubernetes.io/projected/e43abc92-633e-496a-9ec0-68bb2520c12d-kube-api-access-4sjgl\") on node \"crc\" DevicePath \"\"" Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.918698 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e43abc92-633e-496a-9ec0-68bb2520c12d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 22:30:03 crc kubenswrapper[4962]: I1201 22:30:03.918711 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e43abc92-633e-496a-9ec0-68bb2520c12d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 22:30:04 crc kubenswrapper[4962]: I1201 22:30:04.419632 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn" event={"ID":"e43abc92-633e-496a-9ec0-68bb2520c12d","Type":"ContainerDied","Data":"83f469643cf50d820c5296fc677c62f60b62b58f6e66d9fb3531bb2f301f24f3"} Dec 01 22:30:04 crc kubenswrapper[4962]: I1201 22:30:04.419890 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83f469643cf50d820c5296fc677c62f60b62b58f6e66d9fb3531bb2f301f24f3" Dec 01 22:30:04 crc kubenswrapper[4962]: I1201 22:30:04.420026 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn" Dec 01 22:30:04 crc kubenswrapper[4962]: I1201 22:30:04.449031 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr"] Dec 01 22:30:04 crc kubenswrapper[4962]: I1201 22:30:04.459660 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410425-pmmvr"] Dec 01 22:30:06 crc kubenswrapper[4962]: I1201 22:30:06.250658 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7d22d3-7996-4a9a-bc30-584a752a7ef9" path="/var/lib/kubelet/pods/9b7d22d3-7996-4a9a-bc30-584a752a7ef9/volumes" Dec 01 22:30:07 crc kubenswrapper[4962]: I1201 22:30:07.313332 4962 scope.go:117] "RemoveContainer" containerID="921b517902c1b5debb58936b939673779b86f01b9281acf1e846becfc68d4504" Dec 01 22:30:11 crc kubenswrapper[4962]: I1201 22:30:11.219763 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:30:11 crc kubenswrapper[4962]: E1201 22:30:11.220579 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:30:24 crc kubenswrapper[4962]: I1201 22:30:24.220764 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:30:24 crc kubenswrapper[4962]: E1201 22:30:24.222741 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:30:35 crc kubenswrapper[4962]: I1201 22:30:35.219338 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef" Dec 01 22:30:36 crc kubenswrapper[4962]: I1201 22:30:36.845701 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"750f5bf5a0e2257032dd3b4e3aa32091ca5bc4fee7f91f51c5f34d3ada27f955"} Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.044512 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d2j2p"] Dec 01 22:32:26 crc kubenswrapper[4962]: E1201 22:32:26.045585 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43abc92-633e-496a-9ec0-68bb2520c12d" containerName="collect-profiles" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.045601 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43abc92-633e-496a-9ec0-68bb2520c12d" containerName="collect-profiles" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.045874 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43abc92-633e-496a-9ec0-68bb2520c12d" containerName="collect-profiles" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.047762 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.058666 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2j2p"] Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.170392 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-utilities\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.170869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-catalog-content\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.171022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wf2r\" (UniqueName: \"kubernetes.io/projected/5e11e9f8-f16b-44b8-a690-1211fe4af3db-kube-api-access-8wf2r\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.273753 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-utilities\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 
22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.274002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-catalog-content\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.274112 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wf2r\" (UniqueName: \"kubernetes.io/projected/5e11e9f8-f16b-44b8-a690-1211fe4af3db-kube-api-access-8wf2r\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.275412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-utilities\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.275919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-catalog-content\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.305241 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wf2r\" (UniqueName: \"kubernetes.io/projected/5e11e9f8-f16b-44b8-a690-1211fe4af3db-kube-api-access-8wf2r\") pod \"certified-operators-d2j2p\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.377258 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:26 crc kubenswrapper[4962]: I1201 22:32:26.893330 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2j2p"] Dec 01 22:32:26 crc kubenswrapper[4962]: W1201 22:32:26.899286 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e11e9f8_f16b_44b8_a690_1211fe4af3db.slice/crio-8e9dd7628e6e06a87db8c233a8024cf6632c9baec216599e80af714c69b4fe9e WatchSource:0}: Error finding container 8e9dd7628e6e06a87db8c233a8024cf6632c9baec216599e80af714c69b4fe9e: Status 404 returned error can't find the container with id 8e9dd7628e6e06a87db8c233a8024cf6632c9baec216599e80af714c69b4fe9e Dec 01 22:32:27 crc kubenswrapper[4962]: I1201 22:32:27.457281 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerID="90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6" exitCode=0 Dec 01 22:32:27 crc kubenswrapper[4962]: I1201 22:32:27.457369 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2j2p" event={"ID":"5e11e9f8-f16b-44b8-a690-1211fe4af3db","Type":"ContainerDied","Data":"90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6"} Dec 01 22:32:27 crc kubenswrapper[4962]: I1201 22:32:27.457727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2j2p" event={"ID":"5e11e9f8-f16b-44b8-a690-1211fe4af3db","Type":"ContainerStarted","Data":"8e9dd7628e6e06a87db8c233a8024cf6632c9baec216599e80af714c69b4fe9e"} Dec 01 22:32:29 crc kubenswrapper[4962]: I1201 22:32:29.486750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2j2p" event={"ID":"5e11e9f8-f16b-44b8-a690-1211fe4af3db","Type":"ContainerStarted","Data":"60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513"} Dec 01 22:32:30 crc kubenswrapper[4962]: I1201 22:32:30.501379 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerID="60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513" exitCode=0 Dec 01 22:32:30 crc kubenswrapper[4962]: I1201 22:32:30.501439 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2j2p" event={"ID":"5e11e9f8-f16b-44b8-a690-1211fe4af3db","Type":"ContainerDied","Data":"60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513"} Dec 01 22:32:31 crc kubenswrapper[4962]: I1201 22:32:31.515274 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2j2p" event={"ID":"5e11e9f8-f16b-44b8-a690-1211fe4af3db","Type":"ContainerStarted","Data":"021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5"} Dec 01 22:32:31 crc kubenswrapper[4962]: I1201 22:32:31.538921 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d2j2p" podStartSLOduration=2.928860798 podStartE2EDuration="6.538904606s" podCreationTimestamp="2025-12-01 22:32:25 +0000 UTC" firstStartedPulling="2025-12-01 22:32:27.461135623 +0000 UTC m=+3531.562574818" lastFinishedPulling="2025-12-01 22:32:31.071179431 +0000 UTC m=+3535.172618626" observedRunningTime="2025-12-01 22:32:31.536106496 +0000 UTC m=+3535.637545711" watchObservedRunningTime="2025-12-01 22:32:31.538904606 +0000 UTC m=+3535.640343801" 
Dec 01 22:32:36 crc kubenswrapper[4962]: I1201 22:32:36.378086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:36 crc kubenswrapper[4962]: I1201 22:32:36.379771 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:36 crc kubenswrapper[4962]: I1201 22:32:36.447379 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:36 crc kubenswrapper[4962]: I1201 22:32:36.647232 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:36 crc kubenswrapper[4962]: I1201 22:32:36.721358 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2j2p"] Dec 01 22:32:38 crc kubenswrapper[4962]: I1201 22:32:38.605838 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d2j2p" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="registry-server" containerID="cri-o://021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5" gracePeriod=2 Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.245660 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.340364 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wf2r\" (UniqueName: \"kubernetes.io/projected/5e11e9f8-f16b-44b8-a690-1211fe4af3db-kube-api-access-8wf2r\") pod \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.340673 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-catalog-content\") pod \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.349437 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e11e9f8-f16b-44b8-a690-1211fe4af3db-kube-api-access-8wf2r" (OuterVolumeSpecName: "kube-api-access-8wf2r") pod "5e11e9f8-f16b-44b8-a690-1211fe4af3db" (UID: "5e11e9f8-f16b-44b8-a690-1211fe4af3db"). InnerVolumeSpecName "kube-api-access-8wf2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.408675 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e11e9f8-f16b-44b8-a690-1211fe4af3db" (UID: "5e11e9f8-f16b-44b8-a690-1211fe4af3db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.443161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-utilities\") pod \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\" (UID: \"5e11e9f8-f16b-44b8-a690-1211fe4af3db\") " Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.443884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-utilities" (OuterVolumeSpecName: "utilities") pod "5e11e9f8-f16b-44b8-a690-1211fe4af3db" (UID: "5e11e9f8-f16b-44b8-a690-1211fe4af3db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.443903 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wf2r\" (UniqueName: \"kubernetes.io/projected/5e11e9f8-f16b-44b8-a690-1211fe4af3db-kube-api-access-8wf2r\") on node \"crc\" DevicePath \"\"" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.444001 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.546601 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e11e9f8-f16b-44b8-a690-1211fe4af3db-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.622544 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerID="021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5" exitCode=0 Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.622607 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2j2p" event={"ID":"5e11e9f8-f16b-44b8-a690-1211fe4af3db","Type":"ContainerDied","Data":"021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5"} Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.622648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2j2p" event={"ID":"5e11e9f8-f16b-44b8-a690-1211fe4af3db","Type":"ContainerDied","Data":"8e9dd7628e6e06a87db8c233a8024cf6632c9baec216599e80af714c69b4fe9e"} Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.622677 4962 scope.go:117] "RemoveContainer" containerID="021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.622880 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2j2p" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.654090 4962 scope.go:117] "RemoveContainer" containerID="60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.683581 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2j2p"] Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.699410 4962 scope.go:117] "RemoveContainer" containerID="90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.707218 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d2j2p"] Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.760265 4962 scope.go:117] "RemoveContainer" containerID="021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5" Dec 01 22:32:39 crc kubenswrapper[4962]: E1201 22:32:39.760731 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5\": container with ID starting with 021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5 not found: ID does not exist" containerID="021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.760773 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5"} err="failed to get container status \"021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5\": rpc error: code = NotFound desc = could not find container \"021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5\": container with ID starting with 021bbc8158c6ea8cfefee2998e370936cb8d4b4c88c9b6527875d2c8adbbebf5 not found: ID does not exist" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.760800 4962 scope.go:117] "RemoveContainer" containerID="60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513" Dec 01 22:32:39 crc kubenswrapper[4962]: E1201 22:32:39.761487 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513\": container with ID starting with 60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513 not found: ID does not exist" containerID="60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.761519 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513"} err="failed to get container status \"60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513\": rpc error: code = NotFound desc = could not find container \"60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513\": container with ID starting with 60842e05dc0bdc1c945a8fdadde690ab52806d6ee1c23f878a871eede71f1513 not found: ID does not exist" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.761539 4962 scope.go:117] "RemoveContainer" containerID="90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6" Dec 01 22:32:39 crc kubenswrapper[4962]: E1201 22:32:39.761860 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6\": container with ID starting with 90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6 not found: ID does not exist" containerID="90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6" Dec 01 22:32:39 crc kubenswrapper[4962]: I1201 22:32:39.761970 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6"} err="failed to get container status \"90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6\": rpc error: code = NotFound desc = could not find container \"90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6\": container with ID starting with 90ee26829cb7e91638b58001566294a4004a105f9c1ada3fe288b682ddb478e6 not found: ID does not exist" Dec 01 22:32:40 crc kubenswrapper[4962]: I1201 22:32:40.235218 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" path="/var/lib/kubelet/pods/5e11e9f8-f16b-44b8-a690-1211fe4af3db/volumes" Dec 01 22:33:02 crc kubenswrapper[4962]: I1201 22:33:02.790085 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:33:02 crc kubenswrapper[4962]: I1201 22:33:02.790789 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:33:06 crc kubenswrapper[4962]: E1201 22:33:06.169432 4962 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:43426->38.102.83.110:46143: write tcp 38.102.83.110:43426->38.102.83.110:46143: write: broken pipe Dec 01 22:33:32 crc kubenswrapper[4962]: I1201 22:33:32.784552 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:33:32 crc kubenswrapper[4962]: I1201 22:33:32.785147 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.304578 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wh6w2"] Dec 01 22:33:43 crc kubenswrapper[4962]: E1201 22:33:43.305705 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="extract-content" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.305720 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="extract-content" Dec 01 22:33:43 crc kubenswrapper[4962]: 
E1201 22:33:43.305742 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="extract-utilities" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.305752 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="extract-utilities" Dec 01 22:33:43 crc kubenswrapper[4962]: E1201 22:33:43.305807 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="registry-server" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.305816 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="registry-server" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.306109 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e11e9f8-f16b-44b8-a690-1211fe4af3db" containerName="registry-server" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.308162 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.345805 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wh6w2"] Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.428176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-utilities\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.428317 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wtr\" (UniqueName: \"kubernetes.io/projected/0181a986-e580-44ec-bae7-a10366f76f4b-kube-api-access-s2wtr\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.428434 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-catalog-content\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.530807 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-utilities\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.530913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wtr\" (UniqueName: \"kubernetes.io/projected/0181a986-e580-44ec-bae7-a10366f76f4b-kube-api-access-s2wtr\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.531005 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-catalog-content\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.531561 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-catalog-content\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.531636 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-utilities\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.556795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wtr\" (UniqueName: \"kubernetes.io/projected/0181a986-e580-44ec-bae7-a10366f76f4b-kube-api-access-s2wtr\") pod \"redhat-marketplace-wh6w2\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:43 crc kubenswrapper[4962]: I1201 22:33:43.653297 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:44 crc kubenswrapper[4962]: I1201 22:33:44.249791 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wh6w2"] Dec 01 22:33:44 crc kubenswrapper[4962]: W1201 22:33:44.253111 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0181a986_e580_44ec_bae7_a10366f76f4b.slice/crio-0f1617077210b9a24d9339bc698b1c48ac33702c7b59394792a5bf35b8020d22 WatchSource:0}: Error finding container 0f1617077210b9a24d9339bc698b1c48ac33702c7b59394792a5bf35b8020d22: Status 404 returned error can't find the container with id 0f1617077210b9a24d9339bc698b1c48ac33702c7b59394792a5bf35b8020d22 Dec 01 22:33:44 crc kubenswrapper[4962]: I1201 22:33:44.526038 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerStarted","Data":"fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116"} Dec 01 22:33:44 crc kubenswrapper[4962]: I1201 22:33:44.526093 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerStarted","Data":"0f1617077210b9a24d9339bc698b1c48ac33702c7b59394792a5bf35b8020d22"} Dec 01 22:33:44 crc kubenswrapper[4962]: I1201 22:33:44.527918 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 22:33:45 crc kubenswrapper[4962]: I1201 22:33:45.548829 4962 generic.go:334] "Generic (PLEG): container finished" podID="0181a986-e580-44ec-bae7-a10366f76f4b" containerID="fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116" exitCode=0 Dec 01 22:33:45 crc kubenswrapper[4962]: I1201 22:33:45.548963 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" 
event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerDied","Data":"fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116"} Dec 01 22:33:45 crc kubenswrapper[4962]: I1201 22:33:45.549464 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerStarted","Data":"b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c"} Dec 01 22:33:46 crc kubenswrapper[4962]: I1201 22:33:46.562264 4962 generic.go:334] "Generic (PLEG): container finished" podID="0181a986-e580-44ec-bae7-a10366f76f4b" containerID="b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c" exitCode=0 Dec 01 22:33:46 crc kubenswrapper[4962]: I1201 22:33:46.562350 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerDied","Data":"b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c"} Dec 01 22:33:47 crc kubenswrapper[4962]: I1201 22:33:47.593458 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerStarted","Data":"f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5"} Dec 01 22:33:47 crc kubenswrapper[4962]: I1201 22:33:47.622069 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wh6w2" podStartSLOduration=2.089428334 podStartE2EDuration="4.622042368s" podCreationTimestamp="2025-12-01 22:33:43 +0000 UTC" firstStartedPulling="2025-12-01 22:33:44.527698555 +0000 UTC m=+3608.629137750" lastFinishedPulling="2025-12-01 22:33:47.060312589 +0000 UTC m=+3611.161751784" observedRunningTime="2025-12-01 22:33:47.620924727 +0000 UTC m=+3611.722363942" watchObservedRunningTime="2025-12-01 22:33:47.622042368 +0000 UTC m=+3611.723481593" Dec 01 22:33:53 crc kubenswrapper[4962]: I1201 22:33:53.653618 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:53 crc kubenswrapper[4962]: I1201 22:33:53.654271 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:53 crc kubenswrapper[4962]: I1201 22:33:53.758306 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:54 crc kubenswrapper[4962]: I1201 22:33:54.749849 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:54 crc kubenswrapper[4962]: I1201 22:33:54.846200 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wh6w2"] Dec 01 22:33:56 crc kubenswrapper[4962]: I1201 22:33:56.705737 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wh6w2" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="registry-server" containerID="cri-o://f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5" gracePeriod=2 Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.354796 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.465332 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-catalog-content\") pod \"0181a986-e580-44ec-bae7-a10366f76f4b\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.465433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-utilities\") pod \"0181a986-e580-44ec-bae7-a10366f76f4b\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.465692 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2wtr\" (UniqueName: \"kubernetes.io/projected/0181a986-e580-44ec-bae7-a10366f76f4b-kube-api-access-s2wtr\") pod \"0181a986-e580-44ec-bae7-a10366f76f4b\" (UID: \"0181a986-e580-44ec-bae7-a10366f76f4b\") " Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.466222 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-utilities" (OuterVolumeSpecName: "utilities") pod "0181a986-e580-44ec-bae7-a10366f76f4b" (UID: "0181a986-e580-44ec-bae7-a10366f76f4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.472584 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0181a986-e580-44ec-bae7-a10366f76f4b-kube-api-access-s2wtr" (OuterVolumeSpecName: "kube-api-access-s2wtr") pod "0181a986-e580-44ec-bae7-a10366f76f4b" (UID: "0181a986-e580-44ec-bae7-a10366f76f4b"). InnerVolumeSpecName "kube-api-access-s2wtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.483897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0181a986-e580-44ec-bae7-a10366f76f4b" (UID: "0181a986-e580-44ec-bae7-a10366f76f4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.570530 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2wtr\" (UniqueName: \"kubernetes.io/projected/0181a986-e580-44ec-bae7-a10366f76f4b-kube-api-access-s2wtr\") on node \"crc\" DevicePath \"\"" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.570569 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.570579 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0181a986-e580-44ec-bae7-a10366f76f4b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.716445 4962 generic.go:334] "Generic (PLEG): container finished" podID="0181a986-e580-44ec-bae7-a10366f76f4b" containerID="f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5" exitCode=0 Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.716484 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerDied","Data":"f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5"} Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.716510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wh6w2" event={"ID":"0181a986-e580-44ec-bae7-a10366f76f4b","Type":"ContainerDied","Data":"0f1617077210b9a24d9339bc698b1c48ac33702c7b59394792a5bf35b8020d22"} Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.716525 4962 scope.go:117] "RemoveContainer" containerID="f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.717044 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wh6w2" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.746012 4962 scope.go:117] "RemoveContainer" containerID="b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.752073 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wh6w2"] Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.762992 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wh6w2"] Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.770120 4962 scope.go:117] "RemoveContainer" containerID="fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.829813 4962 scope.go:117] "RemoveContainer" containerID="f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5" Dec 01 22:33:57 crc kubenswrapper[4962]: E1201 22:33:57.830486 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5\": container with ID starting with f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5 not found: ID does not exist" containerID="f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.830517 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5"} err="failed to get container status \"f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5\": rpc error: code = NotFound desc = could not find container \"f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5\": container with ID starting with f9eb28ca8a316fc27e8b905024bfef5c1d71d9b77204b84c68feb9040d16e5b5 not found: ID does not exist" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.830537 4962 scope.go:117] "RemoveContainer" containerID="b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c" Dec 01 22:33:57 crc kubenswrapper[4962]: E1201 22:33:57.830851 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c\": container with ID starting with b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c not found: ID does not exist" containerID="b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.830907 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c"} err="failed to get container status \"b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c\": rpc error: code = NotFound desc = could not find container \"b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c\": container with ID starting with b8a3e06854f683743e727e0a2a3ade1a00464e7b1bc1e4750d66bf7348635d1c not found: ID does not exist" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.830966 4962 scope.go:117] "RemoveContainer" containerID="fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116" Dec 01 22:33:57 crc kubenswrapper[4962]: E1201 22:33:57.831256 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116\": container with ID starting with fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116 not found: ID does not exist" containerID="fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116" Dec 01 22:33:57 crc kubenswrapper[4962]: I1201 22:33:57.831292 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116"} err="failed to get container status \"fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116\": rpc error: code = NotFound desc = could not find container \"fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116\": container with ID starting with fd469a093e7d746d2eae5f4b2ce10717c983867e4a707d27910819924ba08116 not found: ID does not exist" Dec 01 22:33:58 crc kubenswrapper[4962]: I1201 22:33:58.234616 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" path="/var/lib/kubelet/pods/0181a986-e580-44ec-bae7-a10366f76f4b/volumes" Dec 01 22:34:02 crc kubenswrapper[4962]: I1201 22:34:02.784311 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:34:02 crc kubenswrapper[4962]: I1201 22:34:02.784823 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:34:02 crc kubenswrapper[4962]: I1201 22:34:02.784858 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 22:34:02 crc kubenswrapper[4962]: I1201 22:34:02.785701 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"750f5bf5a0e2257032dd3b4e3aa32091ca5bc4fee7f91f51c5f34d3ada27f955"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 22:34:02 crc kubenswrapper[4962]: I1201 22:34:02.785780 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://750f5bf5a0e2257032dd3b4e3aa32091ca5bc4fee7f91f51c5f34d3ada27f955" gracePeriod=600 Dec 01 22:34:03 crc kubenswrapper[4962]: I1201 22:34:03.797608 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="750f5bf5a0e2257032dd3b4e3aa32091ca5bc4fee7f91f51c5f34d3ada27f955" exitCode=0 Dec 01 22:34:03 crc kubenswrapper[4962]: I1201 22:34:03.798416 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" 
event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"750f5bf5a0e2257032dd3b4e3aa32091ca5bc4fee7f91f51c5f34d3ada27f955"}
Dec 01 22:34:03 crc kubenswrapper[4962]: I1201 22:34:03.798452 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75"}
Dec 01 22:34:03 crc kubenswrapper[4962]: I1201 22:34:03.798474 4962 scope.go:117] "RemoveContainer" containerID="2e23bada28e6e6bf435d62af0fa8d63ab344b8505237ba65bcba1b122cc5a1ef"
Dec 01 22:36:32 crc kubenswrapper[4962]: I1201 22:36:32.784905 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:36:32 crc kubenswrapper[4962]: I1201 22:36:32.785986 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:37:02 crc kubenswrapper[4962]: I1201 22:37:02.784222 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:37:02 crc kubenswrapper[4962]: I1201 22:37:02.785005 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:37:32 crc kubenswrapper[4962]: I1201 22:37:32.784351 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:37:32 crc kubenswrapper[4962]: I1201 22:37:32.784912 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:37:32 crc kubenswrapper[4962]: I1201 22:37:32.784984 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k"
Dec 01 22:37:32 crc kubenswrapper[4962]: I1201 22:37:32.785981 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 22:37:32 crc kubenswrapper[4962]: I1201 22:37:32.786061 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" gracePeriod=600
Dec 01 22:37:32 crc kubenswrapper[4962]: E1201 22:37:32.923861 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:37:33 crc kubenswrapper[4962]: I1201 22:37:33.731405 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" exitCode=0
Dec 01 22:37:33 crc kubenswrapper[4962]: I1201 22:37:33.731502 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75"}
Dec 01 22:37:33 crc kubenswrapper[4962]: I1201 22:37:33.731782 4962 scope.go:117] "RemoveContainer" containerID="750f5bf5a0e2257032dd3b4e3aa32091ca5bc4fee7f91f51c5f34d3ada27f955"
Dec 01 22:37:33 crc kubenswrapper[4962]: I1201 22:37:33.732680 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75"
Dec 01 22:37:33 crc kubenswrapper[4962]: E1201 22:37:33.733088 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:37:47 crc kubenswrapper[4962]: I1201 22:37:47.220449 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75"
Dec 01 22:37:47 crc kubenswrapper[4962]: E1201 22:37:47.223254 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 22:37:58 crc kubenswrapper[4962]: I1201 22:37:58.220404 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75"
Dec 01 22:37:58 crc kubenswrapper[4962]: E1201 22:37:58.221333 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\""
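
The three probe failures above (22:36:32, 22:37:02, 22:37:32, i.e. 30s apart) are what drive the kubelet to kill and restart the container. Below is a minimal sketch of that HTTP liveness check; only the port 8798 and the /health path come from the log, while the 1s client timeout and failureThreshold=3 are the usual probe defaults assumed for illustration — this is a stand-in for the kubelet's prober, not its actual code.

```go
// liveness_sketch.go — sketch of the HTTP liveness check that produced the
// "connect: connection refused" failures above.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probeOnce(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: HTTP %d", resp.StatusCode)
	}
	return nil
}

func main() {
	for i := 1; i <= 3; i++ { // failureThreshold=3 triggers the restart
		if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
			fmt.Printf("probe failed (%d/3): %v\n", i, err)
		}
		time.Sleep(30 * time.Second) // periodSeconds=30 matches the log spacing
	}
}
```
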
pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:38:10 crc kubenswrapper[4962]: I1201 22:38:10.220774 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:38:10 crc kubenswrapper[4962]: E1201 22:38:10.221905 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.735254 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-78bfd"] Dec 01 22:38:21 crc kubenswrapper[4962]: E1201 22:38:21.736645 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="registry-server" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.736668 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="registry-server" Dec 01 22:38:21 crc kubenswrapper[4962]: E1201 22:38:21.736710 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="extract-utilities" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.736720 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="extract-utilities" Dec 01 22:38:21 crc kubenswrapper[4962]: E1201 22:38:21.736741 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="extract-content" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.736750 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="extract-content" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.737097 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0181a986-e580-44ec-bae7-a10366f76f4b" containerName="registry-server" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.739788 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.761537 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78bfd"] Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.782176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh4xd\" (UniqueName: \"kubernetes.io/projected/cfb34df0-005b-47af-9575-b1dcc52652a2-kube-api-access-fh4xd\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.782276 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-catalog-content\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.782973 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-utilities\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.885691 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh4xd\" (UniqueName: \"kubernetes.io/projected/cfb34df0-005b-47af-9575-b1dcc52652a2-kube-api-access-fh4xd\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.885768 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-catalog-content\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.886000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-utilities\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.886612 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-utilities\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.886838 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-catalog-content\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:21 crc kubenswrapper[4962]: I1201 22:38:21.909672 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fh4xd\" (UniqueName: \"kubernetes.io/projected/cfb34df0-005b-47af-9575-b1dcc52652a2-kube-api-access-fh4xd\") pod \"community-operators-78bfd\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:22 crc kubenswrapper[4962]: I1201 22:38:22.078382 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:22 crc kubenswrapper[4962]: I1201 22:38:22.219956 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:38:22 crc kubenswrapper[4962]: E1201 22:38:22.220385 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:38:22 crc kubenswrapper[4962]: I1201 22:38:22.613315 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78bfd"] Dec 01 22:38:23 crc kubenswrapper[4962]: I1201 22:38:23.425182 4962 generic.go:334] "Generic (PLEG): container finished" podID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerID="1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f" exitCode=0 Dec 01 22:38:23 crc kubenswrapper[4962]: I1201 22:38:23.425306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78bfd" event={"ID":"cfb34df0-005b-47af-9575-b1dcc52652a2","Type":"ContainerDied","Data":"1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f"} Dec 01 22:38:23 crc kubenswrapper[4962]: I1201 22:38:23.426847 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78bfd" event={"ID":"cfb34df0-005b-47af-9575-b1dcc52652a2","Type":"ContainerStarted","Data":"e215c2952978a9e781f666f6e0ebd1e899316d130ca424a2449c095a07394b23"} Dec 01 22:38:25 crc kubenswrapper[4962]: I1201 22:38:25.458963 4962 generic.go:334] "Generic (PLEG): container finished" podID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerID="5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06" exitCode=0 Dec 01 22:38:25 crc kubenswrapper[4962]: I1201 22:38:25.459069 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78bfd" event={"ID":"cfb34df0-005b-47af-9575-b1dcc52652a2","Type":"ContainerDied","Data":"5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06"} Dec 01 22:38:26 crc kubenswrapper[4962]: I1201 22:38:26.472525 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78bfd" event={"ID":"cfb34df0-005b-47af-9575-b1dcc52652a2","Type":"ContainerStarted","Data":"af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0"} Dec 01 22:38:26 crc kubenswrapper[4962]: I1201 22:38:26.508600 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-78bfd" podStartSLOduration=2.966715539 podStartE2EDuration="5.508579311s" podCreationTimestamp="2025-12-01 22:38:21 +0000 UTC" firstStartedPulling="2025-12-01 22:38:23.427697392 +0000 UTC m=+3887.529136627" 
lastFinishedPulling="2025-12-01 22:38:25.969561194 +0000 UTC m=+3890.071000399" observedRunningTime="2025-12-01 22:38:26.495594363 +0000 UTC m=+3890.597033578" watchObservedRunningTime="2025-12-01 22:38:26.508579311 +0000 UTC m=+3890.610018506" Dec 01 22:38:32 crc kubenswrapper[4962]: I1201 22:38:32.079091 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:32 crc kubenswrapper[4962]: I1201 22:38:32.080402 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:32 crc kubenswrapper[4962]: I1201 22:38:32.146337 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:32 crc kubenswrapper[4962]: I1201 22:38:32.608052 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:32 crc kubenswrapper[4962]: I1201 22:38:32.661116 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78bfd"] Dec 01 22:38:34 crc kubenswrapper[4962]: I1201 22:38:34.220464 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:38:34 crc kubenswrapper[4962]: E1201 22:38:34.221702 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:38:34 crc kubenswrapper[4962]: I1201 22:38:34.573838 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-78bfd" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="registry-server" containerID="cri-o://af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0" gracePeriod=2 Dec 01 22:38:35 crc kubenswrapper[4962]: I1201 22:38:35.800972 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-6bphp" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" containerName="registry-server" probeResult="failure" output=< Dec 01 22:38:35 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:38:35 crc kubenswrapper[4962]: > Dec 01 22:38:35 crc kubenswrapper[4962]: I1201 22:38:35.824105 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6bphp" podUID="6ee35195-33b7-4bc8-80fb-7eb9f0ca221f" containerName="registry-server" probeResult="failure" output=< Dec 01 22:38:35 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:38:35 crc kubenswrapper[4962]: > Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.351446 4962 util.go:48] "No ready sandbox for pod can be found. 
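
The pod_startup_latency_tracker entry that closes above is internally consistent: podStartE2EDuration (5.508579311s) minus the image-pull window bounded by firstStartedPulling/lastFinishedPulling (monotonic m=+3887.529136627 to m=+3890.071000399, i.e. 2.541863772s) gives exactly podStartSLOduration=2.966715539s. A sketch of that arithmetic, using the values copied from the log:

```go
// startup_latency_sketch.go — reproducing the pod_startup_latency_tracker
// arithmetic for community-operators-78bfd: the SLO duration is the e2e
// startup duration minus the image-pull window, computed on the kubelet's
// monotonic clock (the m=+... offsets in the log).
package main

import "fmt"

func main() {
	firstStartedPulling := 3887.529136627 // m=+ offset, seconds
	lastFinishedPulling := 3890.071000399 // m=+ offset, seconds
	e2e := 5.508579311                    // podStartE2EDuration, seconds

	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image-pull window:   %.9fs\n", pull)     // 2.541863772s
	fmt.Printf("podStartSLOduration: %.9fs\n", e2e-pull) // 2.966715539s
}
```
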
Need to start a new one" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.468698 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh4xd\" (UniqueName: \"kubernetes.io/projected/cfb34df0-005b-47af-9575-b1dcc52652a2-kube-api-access-fh4xd\") pod \"cfb34df0-005b-47af-9575-b1dcc52652a2\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.468763 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-catalog-content\") pod \"cfb34df0-005b-47af-9575-b1dcc52652a2\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.468816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-utilities\") pod \"cfb34df0-005b-47af-9575-b1dcc52652a2\" (UID: \"cfb34df0-005b-47af-9575-b1dcc52652a2\") " Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.469538 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-utilities" (OuterVolumeSpecName: "utilities") pod "cfb34df0-005b-47af-9575-b1dcc52652a2" (UID: "cfb34df0-005b-47af-9575-b1dcc52652a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.475223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb34df0-005b-47af-9575-b1dcc52652a2-kube-api-access-fh4xd" (OuterVolumeSpecName: "kube-api-access-fh4xd") pod "cfb34df0-005b-47af-9575-b1dcc52652a2" (UID: "cfb34df0-005b-47af-9575-b1dcc52652a2"). InnerVolumeSpecName "kube-api-access-fh4xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.513084 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfb34df0-005b-47af-9575-b1dcc52652a2" (UID: "cfb34df0-005b-47af-9575-b1dcc52652a2"). InnerVolumeSpecName "catalog-content". 
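
The redhat-marketplace-6bphp probe failures a little above (`timeout: failed to connect service ":50051" within 1s`) are the signature of a gRPC health check against the registry server's grpc.health.v1 endpoint on port 50051. A rough equivalent follows, assuming the standard gRPC health-checking protocol; the actual probe command and its flags are not visible in the log, so this is an approximation rather than the real probe implementation.

```go
// grpc_probe_sketch.go — approximate gRPC health check behind the
// "failed to connect service \":50051\" within 1s" output above.
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func check(addr string) error {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()
	conn, err := grpc.DialContext(ctx, addr,
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock()) // block so the 1s deadline covers connection setup
	if err != nil {
		return fmt.Errorf("timeout: failed to connect service %q within 1s: %w", addr, err)
	}
	defer conn.Close()
	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		return err
	}
	if resp.Status != healthpb.HealthCheckResponse_SERVING {
		return fmt.Errorf("not serving: %s", resp.Status)
	}
	return nil
}

func main() {
	if err := check(":50051"); err != nil {
		fmt.Println(err)
	}
}
```
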
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.571668 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh4xd\" (UniqueName: \"kubernetes.io/projected/cfb34df0-005b-47af-9575-b1dcc52652a2-kube-api-access-fh4xd\") on node \"crc\" DevicePath \"\"" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.571703 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.571729 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfb34df0-005b-47af-9575-b1dcc52652a2-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.815477 4962 generic.go:334] "Generic (PLEG): container finished" podID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerID="af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0" exitCode=0 Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.815517 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78bfd" event={"ID":"cfb34df0-005b-47af-9575-b1dcc52652a2","Type":"ContainerDied","Data":"af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0"} Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.815543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78bfd" event={"ID":"cfb34df0-005b-47af-9575-b1dcc52652a2","Type":"ContainerDied","Data":"e215c2952978a9e781f666f6e0ebd1e899316d130ca424a2449c095a07394b23"} Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.815558 4962 scope.go:117] "RemoveContainer" containerID="af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.815563 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78bfd" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.855643 4962 scope.go:117] "RemoveContainer" containerID="5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.865865 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78bfd"] Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.881071 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-78bfd"] Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.894403 4962 scope.go:117] "RemoveContainer" containerID="1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.953809 4962 scope.go:117] "RemoveContainer" containerID="af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0" Dec 01 22:38:36 crc kubenswrapper[4962]: E1201 22:38:36.954359 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0\": container with ID starting with af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0 not found: ID does not exist" containerID="af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.954412 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0"} err="failed to get container status \"af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0\": rpc error: code = NotFound desc = could not find container \"af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0\": container with ID starting with af8ddba418d0c0d7ddb1707508f55c93a0c9328582faef5d120be2c8c87da3c0 not found: ID does not exist" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.954446 4962 scope.go:117] "RemoveContainer" containerID="5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06" Dec 01 22:38:36 crc kubenswrapper[4962]: E1201 22:38:36.954920 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06\": container with ID starting with 5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06 not found: ID does not exist" containerID="5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.954983 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06"} err="failed to get container status \"5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06\": rpc error: code = NotFound desc = could not find container \"5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06\": container with ID starting with 5c9f2515c949965f4e79e4f909373821c7ab97f51a07263d46ca83f23bbc3f06 not found: ID does not exist" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.955013 4962 scope.go:117] "RemoveContainer" containerID="1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f" Dec 01 22:38:36 crc kubenswrapper[4962]: E1201 22:38:36.955355 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f\": container with ID starting with 1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f not found: ID does not exist" containerID="1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f" Dec 01 22:38:36 crc kubenswrapper[4962]: I1201 22:38:36.955389 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f"} err="failed to get container status \"1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f\": rpc error: code = NotFound desc = could not find container \"1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f\": container with ID starting with 1e452a72b0025e222ad7994c7115f2df76fb9f3a607f49a831d279bb2f451d6f not found: ID does not exist" Dec 01 22:38:38 crc kubenswrapper[4962]: I1201 22:38:38.247041 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" path="/var/lib/kubelet/pods/cfb34df0-005b-47af-9575-b1dcc52652a2/volumes" Dec 01 22:38:48 crc kubenswrapper[4962]: I1201 22:38:48.222685 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:38:48 crc kubenswrapper[4962]: E1201 22:38:48.223528 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:39:00 crc kubenswrapper[4962]: I1201 22:39:00.220249 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:39:00 crc kubenswrapper[4962]: E1201 22:39:00.221187 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:39:13 crc kubenswrapper[4962]: I1201 22:39:13.219525 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:39:13 crc kubenswrapper[4962]: E1201 22:39:13.220441 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:39:25 crc kubenswrapper[4962]: I1201 22:39:25.219626 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:39:25 crc kubenswrapper[4962]: E1201 22:39:25.220336 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:39:38 crc kubenswrapper[4962]: I1201 22:39:38.219697 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:39:38 crc kubenswrapper[4962]: E1201 22:39:38.220702 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:39:53 crc kubenswrapper[4962]: I1201 22:39:53.219303 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:39:53 crc kubenswrapper[4962]: E1201 22:39:53.220179 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:40:06 crc kubenswrapper[4962]: I1201 22:40:06.229583 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:40:06 crc kubenswrapper[4962]: E1201 22:40:06.230844 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:40:20 crc kubenswrapper[4962]: I1201 22:40:20.220567 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:40:20 crc kubenswrapper[4962]: E1201 22:40:20.221738 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:40:32 crc kubenswrapper[4962]: I1201 22:40:32.223905 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:40:32 crc kubenswrapper[4962]: E1201 22:40:32.225783 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:40:44 crc kubenswrapper[4962]: I1201 22:40:44.221006 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:40:44 crc kubenswrapper[4962]: E1201 22:40:44.221925 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:40:55 crc kubenswrapper[4962]: I1201 22:40:55.220286 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:40:55 crc kubenswrapper[4962]: E1201 22:40:55.221687 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:41:07 crc kubenswrapper[4962]: I1201 22:41:07.221390 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:41:07 crc kubenswrapper[4962]: E1201 22:41:07.222733 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:41:20 crc kubenswrapper[4962]: I1201 22:41:20.220036 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:41:20 crc kubenswrapper[4962]: E1201 22:41:20.221009 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:41:32 crc kubenswrapper[4962]: I1201 22:41:32.220460 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:41:32 crc kubenswrapper[4962]: E1201 22:41:32.221715 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" 
podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:41:47 crc kubenswrapper[4962]: I1201 22:41:47.220055 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:41:47 crc kubenswrapper[4962]: E1201 22:41:47.220649 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:42:00 crc kubenswrapper[4962]: I1201 22:42:00.220645 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:42:00 crc kubenswrapper[4962]: E1201 22:42:00.222000 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:42:11 crc kubenswrapper[4962]: I1201 22:42:11.219872 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:42:11 crc kubenswrapper[4962]: E1201 22:42:11.221020 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:42:25 crc kubenswrapper[4962]: I1201 22:42:25.221015 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:42:25 crc kubenswrapper[4962]: E1201 22:42:25.222407 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:42:37 crc kubenswrapper[4962]: I1201 22:42:37.220673 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:42:37 crc kubenswrapper[4962]: I1201 22:42:37.685960 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"c4d9a7193f7ebfe349227a99972c1252553194d867623ce522be000424eb6be4"} Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.417556 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb54"] Dec 01 22:44:17 crc kubenswrapper[4962]: E1201 22:44:17.418434 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
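
Note the timing of the recovery above: the container exited at 22:37:33, every retry logged "back-off 5m0s", and the replacement container (c4d9a719...) only started at 22:42:37, once the capped backoff window had elapsed. A sketch of that capped exponential backoff; the 10s initial delay and 2x growth are kubelet defaults to the best of my knowledge, while the 5m cap is visible in the log itself.

```go
// backoff_sketch.go — capped exponential backoff behind the repeating
// "back-off 5m0s" messages above.
package main

import (
	"fmt"
	"time"
)

// crashLoopDelays returns the delay applied before each of the first n
// restarts of a crash-looping container.
func crashLoopDelays(n int) []time.Duration {
	const (
		initialDelay = 10 * time.Second
		maxDelay     = 5 * time.Minute
	)
	delays := make([]time.Duration, 0, n)
	d := initialDelay
	for i := 0; i < n; i++ {
		delays = append(delays, d)
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
	return delays
}

func main() {
	fmt.Println(crashLoopDelays(8)) // [10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s]
}
```
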
podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="extract-content" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.418447 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="extract-content" Dec 01 22:44:17 crc kubenswrapper[4962]: E1201 22:44:17.418474 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="extract-utilities" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.418480 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="extract-utilities" Dec 01 22:44:17 crc kubenswrapper[4962]: E1201 22:44:17.418507 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="registry-server" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.418513 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="registry-server" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.418716 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb34df0-005b-47af-9575-b1dcc52652a2" containerName="registry-server" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.420348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.442347 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb54"] Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.563659 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-catalog-content\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.563712 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh28k\" (UniqueName: \"kubernetes.io/projected/db11f2ef-a4f6-42bb-909a-95df7a35932d-kube-api-access-fh28k\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.564583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-utilities\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.666239 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-catalog-content\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.666292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh28k\" (UniqueName: \"kubernetes.io/projected/db11f2ef-a4f6-42bb-909a-95df7a35932d-kube-api-access-fh28k\") pod 
\"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.666381 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-utilities\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.666907 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-utilities\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.667079 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-catalog-content\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.686118 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh28k\" (UniqueName: \"kubernetes.io/projected/db11f2ef-a4f6-42bb-909a-95df7a35932d-kube-api-access-fh28k\") pod \"redhat-marketplace-7pb54\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:17 crc kubenswrapper[4962]: I1201 22:44:17.741781 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:18 crc kubenswrapper[4962]: I1201 22:44:18.303810 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb54"] Dec 01 22:44:19 crc kubenswrapper[4962]: I1201 22:44:19.140828 4962 generic.go:334] "Generic (PLEG): container finished" podID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerID="2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710" exitCode=0 Dec 01 22:44:19 crc kubenswrapper[4962]: I1201 22:44:19.140973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb54" event={"ID":"db11f2ef-a4f6-42bb-909a-95df7a35932d","Type":"ContainerDied","Data":"2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710"} Dec 01 22:44:19 crc kubenswrapper[4962]: I1201 22:44:19.142086 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb54" event={"ID":"db11f2ef-a4f6-42bb-909a-95df7a35932d","Type":"ContainerStarted","Data":"9ddaea0da31817ac099d63d388fa3a73b8fdcf100748b395f0161f1d6734ba56"} Dec 01 22:44:19 crc kubenswrapper[4962]: I1201 22:44:19.144985 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 22:44:21 crc kubenswrapper[4962]: I1201 22:44:21.170260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb54" event={"ID":"db11f2ef-a4f6-42bb-909a-95df7a35932d","Type":"ContainerStarted","Data":"28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369"} Dec 01 22:44:22 crc kubenswrapper[4962]: I1201 22:44:22.183497 4962 generic.go:334] "Generic (PLEG): container finished" podID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerID="28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369" exitCode=0 Dec 01 22:44:22 crc kubenswrapper[4962]: I1201 22:44:22.183548 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb54" event={"ID":"db11f2ef-a4f6-42bb-909a-95df7a35932d","Type":"ContainerDied","Data":"28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369"} Dec 01 22:44:24 crc kubenswrapper[4962]: I1201 22:44:24.215102 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb54" event={"ID":"db11f2ef-a4f6-42bb-909a-95df7a35932d","Type":"ContainerStarted","Data":"c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953"} Dec 01 22:44:24 crc kubenswrapper[4962]: I1201 22:44:24.260500 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7pb54" podStartSLOduration=3.649920616 podStartE2EDuration="7.260480253s" podCreationTimestamp="2025-12-01 22:44:17 +0000 UTC" firstStartedPulling="2025-12-01 22:44:19.144585064 +0000 UTC m=+4243.246024289" lastFinishedPulling="2025-12-01 22:44:22.755144721 +0000 UTC m=+4246.856583926" observedRunningTime="2025-12-01 22:44:24.244746937 +0000 UTC m=+4248.346186172" watchObservedRunningTime="2025-12-01 22:44:24.260480253 +0000 UTC m=+4248.361919458" Dec 01 22:44:27 crc kubenswrapper[4962]: I1201 22:44:27.742608 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:27 crc kubenswrapper[4962]: I1201 22:44:27.744195 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 
22:44:27 crc kubenswrapper[4962]: I1201 22:44:27.796847 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:28 crc kubenswrapper[4962]: I1201 22:44:28.398015 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:28 crc kubenswrapper[4962]: I1201 22:44:28.447829 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb54"] Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.299759 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7pb54" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="registry-server" containerID="cri-o://c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953" gracePeriod=2 Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.920889 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.937450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh28k\" (UniqueName: \"kubernetes.io/projected/db11f2ef-a4f6-42bb-909a-95df7a35932d-kube-api-access-fh28k\") pod \"db11f2ef-a4f6-42bb-909a-95df7a35932d\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.937683 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-catalog-content\") pod \"db11f2ef-a4f6-42bb-909a-95df7a35932d\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.937819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-utilities\") pod \"db11f2ef-a4f6-42bb-909a-95df7a35932d\" (UID: \"db11f2ef-a4f6-42bb-909a-95df7a35932d\") " Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.939169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-utilities" (OuterVolumeSpecName: "utilities") pod "db11f2ef-a4f6-42bb-909a-95df7a35932d" (UID: "db11f2ef-a4f6-42bb-909a-95df7a35932d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.945062 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db11f2ef-a4f6-42bb-909a-95df7a35932d-kube-api-access-fh28k" (OuterVolumeSpecName: "kube-api-access-fh28k") pod "db11f2ef-a4f6-42bb-909a-95df7a35932d" (UID: "db11f2ef-a4f6-42bb-909a-95df7a35932d"). InnerVolumeSpecName "kube-api-access-fh28k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:44:30 crc kubenswrapper[4962]: I1201 22:44:30.963699 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db11f2ef-a4f6-42bb-909a-95df7a35932d" (UID: "db11f2ef-a4f6-42bb-909a-95df7a35932d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.039830 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh28k\" (UniqueName: \"kubernetes.io/projected/db11f2ef-a4f6-42bb-909a-95df7a35932d-kube-api-access-fh28k\") on node \"crc\" DevicePath \"\"" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.040194 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.040376 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db11f2ef-a4f6-42bb-909a-95df7a35932d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.313887 4962 generic.go:334] "Generic (PLEG): container finished" podID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerID="c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953" exitCode=0 Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.313957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb54" event={"ID":"db11f2ef-a4f6-42bb-909a-95df7a35932d","Type":"ContainerDied","Data":"c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953"} Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.314012 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pb54" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.315699 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pb54" event={"ID":"db11f2ef-a4f6-42bb-909a-95df7a35932d","Type":"ContainerDied","Data":"9ddaea0da31817ac099d63d388fa3a73b8fdcf100748b395f0161f1d6734ba56"} Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.315807 4962 scope.go:117] "RemoveContainer" containerID="c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.351214 4962 scope.go:117] "RemoveContainer" containerID="28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.476026 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb54"] Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.492874 4962 scope.go:117] "RemoveContainer" containerID="2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.495709 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pb54"] Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.546137 4962 scope.go:117] "RemoveContainer" containerID="c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953" Dec 01 22:44:31 crc kubenswrapper[4962]: E1201 22:44:31.546671 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953\": container with ID starting with c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953 not found: ID does not exist" containerID="c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.546770 4962 
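
The sequence that follows (and the identical one for community-operators-78bfd at 22:38:36) shows the kubelet's container cleanup racing with CRI-O: ContainerStatus and DeleteContainer return NotFound because the containers are already gone. These errors are benign. The usual idempotent-delete pattern is sketched below; this is illustrative, not the kubelet's literal code.

```go
// idempotent_delete_sketch.go — treating gRPC NotFound as success when
// removing a container that the runtime has already deleted, as in the
// "DeleteContainer returned error ... NotFound" entries nearby.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func deleteContainer(id string, remove func(string) error) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already gone: deletion is idempotent
		}
		return err
	}
	return nil
}

func main() {
	alreadyGone := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(deleteContainer("c836d3dcebb4...", alreadyGone)) // <nil>
}
```
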
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953"} err="failed to get container status \"c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953\": rpc error: code = NotFound desc = could not find container \"c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953\": container with ID starting with c836d3dcebb41d019ecfc173ce50830f7410a8c11f9019c07abfbb25d0011953 not found: ID does not exist" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.546876 4962 scope.go:117] "RemoveContainer" containerID="28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369" Dec 01 22:44:31 crc kubenswrapper[4962]: E1201 22:44:31.547477 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369\": container with ID starting with 28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369 not found: ID does not exist" containerID="28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.547560 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369"} err="failed to get container status \"28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369\": rpc error: code = NotFound desc = could not find container \"28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369\": container with ID starting with 28358f4b2ea92c5820e3d97c333313d50c1687448daa447d5d2322a7fa44c369 not found: ID does not exist" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.547640 4962 scope.go:117] "RemoveContainer" containerID="2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710" Dec 01 22:44:31 crc kubenswrapper[4962]: E1201 22:44:31.547979 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710\": container with ID starting with 2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710 not found: ID does not exist" containerID="2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710" Dec 01 22:44:31 crc kubenswrapper[4962]: I1201 22:44:31.548059 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710"} err="failed to get container status \"2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710\": rpc error: code = NotFound desc = could not find container \"2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710\": container with ID starting with 2618e02dda2647851d4c7c7c8c9412858b67c10239fc4e955835f7cd9e89d710 not found: ID does not exist" Dec 01 22:44:32 crc kubenswrapper[4962]: I1201 22:44:32.232518 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" path="/var/lib/kubelet/pods/db11f2ef-a4f6-42bb-909a-95df7a35932d/volumes" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.215827 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp"] Dec 01 22:45:00 crc kubenswrapper[4962]: E1201 22:45:00.217457 4962 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="extract-content" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.217485 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="extract-content" Dec 01 22:45:00 crc kubenswrapper[4962]: E1201 22:45:00.217537 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="registry-server" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.217552 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="registry-server" Dec 01 22:45:00 crc kubenswrapper[4962]: E1201 22:45:00.217579 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="extract-utilities" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.217593 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="extract-utilities" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.218111 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="db11f2ef-a4f6-42bb-909a-95df7a35932d" containerName="registry-server" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.219545 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.221921 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.223481 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.298385 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp"] Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.381561 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b4cf8-4035-421a-acca-98b2a30e66c4-secret-volume\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.381817 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b4cf8-4035-421a-acca-98b2a30e66c4-config-volume\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.382082 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knn9z\" (UniqueName: \"kubernetes.io/projected/258b4cf8-4035-421a-acca-98b2a30e66c4-kube-api-access-knn9z\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.485143 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-knn9z\" (UniqueName: \"kubernetes.io/projected/258b4cf8-4035-421a-acca-98b2a30e66c4-kube-api-access-knn9z\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.485370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b4cf8-4035-421a-acca-98b2a30e66c4-secret-volume\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.485580 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b4cf8-4035-421a-acca-98b2a30e66c4-config-volume\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.487379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b4cf8-4035-421a-acca-98b2a30e66c4-config-volume\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.494410 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b4cf8-4035-421a-acca-98b2a30e66c4-secret-volume\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.505215 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knn9z\" (UniqueName: \"kubernetes.io/projected/258b4cf8-4035-421a-acca-98b2a30e66c4-kube-api-access-knn9z\") pod \"collect-profiles-29410485-bj2bp\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:00 crc kubenswrapper[4962]: I1201 22:45:00.562861 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:01 crc kubenswrapper[4962]: I1201 22:45:01.084596 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp"] Dec 01 22:45:01 crc kubenswrapper[4962]: W1201 22:45:01.090039 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod258b4cf8_4035_421a_acca_98b2a30e66c4.slice/crio-6d560720ed3f60cc2b101236510e46200645f772e4ecde6657cf35c4e1107ad8 WatchSource:0}: Error finding container 6d560720ed3f60cc2b101236510e46200645f772e4ecde6657cf35c4e1107ad8: Status 404 returned error can't find the container with id 6d560720ed3f60cc2b101236510e46200645f772e4ecde6657cf35c4e1107ad8 Dec 01 22:45:01 crc kubenswrapper[4962]: I1201 22:45:01.760121 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" event={"ID":"258b4cf8-4035-421a-acca-98b2a30e66c4","Type":"ContainerStarted","Data":"6d560720ed3f60cc2b101236510e46200645f772e4ecde6657cf35c4e1107ad8"} Dec 01 22:45:02 crc kubenswrapper[4962]: I1201 22:45:02.779160 4962 generic.go:334] "Generic (PLEG): container finished" podID="258b4cf8-4035-421a-acca-98b2a30e66c4" containerID="0248c1caa1b487c689436888931f18d07f42432086c6f78811ef97f31a5f988c" exitCode=0 Dec 01 22:45:02 crc kubenswrapper[4962]: I1201 22:45:02.779487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" event={"ID":"258b4cf8-4035-421a-acca-98b2a30e66c4","Type":"ContainerDied","Data":"0248c1caa1b487c689436888931f18d07f42432086c6f78811ef97f31a5f988c"} Dec 01 22:45:02 crc kubenswrapper[4962]: I1201 22:45:02.807168 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:45:02 crc kubenswrapper[4962]: I1201 22:45:02.807493 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.714881 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.786900 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b4cf8-4035-421a-acca-98b2a30e66c4-secret-volume\") pod \"258b4cf8-4035-421a-acca-98b2a30e66c4\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.787076 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b4cf8-4035-421a-acca-98b2a30e66c4-config-volume\") pod \"258b4cf8-4035-421a-acca-98b2a30e66c4\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.787156 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knn9z\" (UniqueName: \"kubernetes.io/projected/258b4cf8-4035-421a-acca-98b2a30e66c4-kube-api-access-knn9z\") pod \"258b4cf8-4035-421a-acca-98b2a30e66c4\" (UID: \"258b4cf8-4035-421a-acca-98b2a30e66c4\") " Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.788186 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258b4cf8-4035-421a-acca-98b2a30e66c4-config-volume" (OuterVolumeSpecName: "config-volume") pod "258b4cf8-4035-421a-acca-98b2a30e66c4" (UID: "258b4cf8-4035-421a-acca-98b2a30e66c4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.800375 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258b4cf8-4035-421a-acca-98b2a30e66c4-kube-api-access-knn9z" (OuterVolumeSpecName: "kube-api-access-knn9z") pod "258b4cf8-4035-421a-acca-98b2a30e66c4" (UID: "258b4cf8-4035-421a-acca-98b2a30e66c4"). InnerVolumeSpecName "kube-api-access-knn9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.800664 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258b4cf8-4035-421a-acca-98b2a30e66c4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "258b4cf8-4035-421a-acca-98b2a30e66c4" (UID: "258b4cf8-4035-421a-acca-98b2a30e66c4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.819722 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" event={"ID":"258b4cf8-4035-421a-acca-98b2a30e66c4","Type":"ContainerDied","Data":"6d560720ed3f60cc2b101236510e46200645f772e4ecde6657cf35c4e1107ad8"} Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.819845 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d560720ed3f60cc2b101236510e46200645f772e4ecde6657cf35c4e1107ad8" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.819871 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.890768 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b4cf8-4035-421a-acca-98b2a30e66c4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.891295 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knn9z\" (UniqueName: \"kubernetes.io/projected/258b4cf8-4035-421a-acca-98b2a30e66c4-kube-api-access-knn9z\") on node \"crc\" DevicePath \"\"" Dec 01 22:45:04 crc kubenswrapper[4962]: I1201 22:45:04.891310 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b4cf8-4035-421a-acca-98b2a30e66c4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 22:45:05 crc kubenswrapper[4962]: I1201 22:45:05.838624 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9"] Dec 01 22:45:05 crc kubenswrapper[4962]: I1201 22:45:05.849909 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410440-fqjh9"] Dec 01 22:45:06 crc kubenswrapper[4962]: I1201 22:45:06.247841 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e2c40f-767a-4ec4-a768-f64f3d2b5b20" path="/var/lib/kubelet/pods/e5e2c40f-767a-4ec4-a768-f64f3d2b5b20/volumes" Dec 01 22:45:07 crc kubenswrapper[4962]: I1201 22:45:07.927482 4962 scope.go:117] "RemoveContainer" containerID="47f9cc9152e2e099a2a13a1dc683fecd637345e62e616527567081918534a58c" Dec 01 22:45:32 crc kubenswrapper[4962]: I1201 22:45:32.785103 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:45:32 crc kubenswrapper[4962]: I1201 22:45:32.785789 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.010440 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lt8md"] Dec 01 22:45:52 crc kubenswrapper[4962]: E1201 22:45:52.011435 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258b4cf8-4035-421a-acca-98b2a30e66c4" containerName="collect-profiles" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.011449 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="258b4cf8-4035-421a-acca-98b2a30e66c4" containerName="collect-profiles" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.011740 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="258b4cf8-4035-421a-acca-98b2a30e66c4" containerName="collect-profiles" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.013448 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.029272 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lt8md"] Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.146894 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-utilities\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.147192 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvhpd\" (UniqueName: \"kubernetes.io/projected/22b11b92-29b0-4134-a443-4656e66fbb77-kube-api-access-nvhpd\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.147610 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-catalog-content\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.250184 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-utilities\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.250259 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvhpd\" (UniqueName: \"kubernetes.io/projected/22b11b92-29b0-4134-a443-4656e66fbb77-kube-api-access-nvhpd\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.250392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-catalog-content\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.250731 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-catalog-content\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.251038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-utilities\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.271460 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nvhpd\" (UniqueName: \"kubernetes.io/projected/22b11b92-29b0-4134-a443-4656e66fbb77-kube-api-access-nvhpd\") pod \"redhat-operators-lt8md\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.344250 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.796646 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lt8md"] Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.808612 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f7rrw"] Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.811483 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:52 crc kubenswrapper[4962]: W1201 22:45:52.821201 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b11b92_29b0_4134_a443_4656e66fbb77.slice/crio-e8fa8abf02b29dedbd97aad9bb25dddb45491d9bca7a3a8d32a7065ce1500649 WatchSource:0}: Error finding container e8fa8abf02b29dedbd97aad9bb25dddb45491d9bca7a3a8d32a7065ce1500649: Status 404 returned error can't find the container with id e8fa8abf02b29dedbd97aad9bb25dddb45491d9bca7a3a8d32a7065ce1500649 Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.822301 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7rrw"] Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.968523 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-catalog-content\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.968800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkq2x\" (UniqueName: \"kubernetes.io/projected/b8556f12-683b-4ea7-a325-a1e0f98f86e7-kube-api-access-wkq2x\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:52 crc kubenswrapper[4962]: I1201 22:45:52.968964 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-utilities\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.070844 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-catalog-content\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.070914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkq2x\" (UniqueName: 
\"kubernetes.io/projected/b8556f12-683b-4ea7-a325-a1e0f98f86e7-kube-api-access-wkq2x\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.071145 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-utilities\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.071336 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-catalog-content\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.071552 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-utilities\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.090985 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkq2x\" (UniqueName: \"kubernetes.io/projected/b8556f12-683b-4ea7-a325-a1e0f98f86e7-kube-api-access-wkq2x\") pod \"certified-operators-f7rrw\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.267474 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.509439 4962 generic.go:334] "Generic (PLEG): container finished" podID="22b11b92-29b0-4134-a443-4656e66fbb77" containerID="8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02" exitCode=0 Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.509760 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt8md" event={"ID":"22b11b92-29b0-4134-a443-4656e66fbb77","Type":"ContainerDied","Data":"8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02"} Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.509788 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt8md" event={"ID":"22b11b92-29b0-4134-a443-4656e66fbb77","Type":"ContainerStarted","Data":"e8fa8abf02b29dedbd97aad9bb25dddb45491d9bca7a3a8d32a7065ce1500649"} Dec 01 22:45:53 crc kubenswrapper[4962]: I1201 22:45:53.914199 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7rrw"] Dec 01 22:45:54 crc kubenswrapper[4962]: I1201 22:45:54.523507 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerID="58140522fd40baab94327991e190577b42cdf569c628be974ae37a6afe2fc261" exitCode=0 Dec 01 22:45:54 crc kubenswrapper[4962]: I1201 22:45:54.523709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7rrw" event={"ID":"b8556f12-683b-4ea7-a325-a1e0f98f86e7","Type":"ContainerDied","Data":"58140522fd40baab94327991e190577b42cdf569c628be974ae37a6afe2fc261"} Dec 01 22:45:54 crc kubenswrapper[4962]: I1201 22:45:54.523896 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7rrw" event={"ID":"b8556f12-683b-4ea7-a325-a1e0f98f86e7","Type":"ContainerStarted","Data":"dda822edc4936b40732aea3ead61efb5957dc01f9583190477b15d3e1e3a634b"} Dec 01 22:45:55 crc kubenswrapper[4962]: I1201 22:45:55.537070 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt8md" event={"ID":"22b11b92-29b0-4134-a443-4656e66fbb77","Type":"ContainerStarted","Data":"18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580"} Dec 01 22:45:57 crc kubenswrapper[4962]: I1201 22:45:57.569349 4962 generic.go:334] "Generic (PLEG): container finished" podID="22b11b92-29b0-4134-a443-4656e66fbb77" containerID="18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580" exitCode=0 Dec 01 22:45:57 crc kubenswrapper[4962]: I1201 22:45:57.569472 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt8md" event={"ID":"22b11b92-29b0-4134-a443-4656e66fbb77","Type":"ContainerDied","Data":"18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580"} Dec 01 22:45:57 crc kubenswrapper[4962]: I1201 22:45:57.589552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7rrw" event={"ID":"b8556f12-683b-4ea7-a325-a1e0f98f86e7","Type":"ContainerStarted","Data":"e39ed7d1b94d3829da78b7482cf1233cbf408127aa345a80aade3a83f04651aa"} Dec 01 22:45:58 crc kubenswrapper[4962]: I1201 22:45:58.634823 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerID="e39ed7d1b94d3829da78b7482cf1233cbf408127aa345a80aade3a83f04651aa" exitCode=0 Dec 01 22:45:58 crc 
kubenswrapper[4962]: I1201 22:45:58.634911 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7rrw" event={"ID":"b8556f12-683b-4ea7-a325-a1e0f98f86e7","Type":"ContainerDied","Data":"e39ed7d1b94d3829da78b7482cf1233cbf408127aa345a80aade3a83f04651aa"} Dec 01 22:46:00 crc kubenswrapper[4962]: I1201 22:46:00.664566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7rrw" event={"ID":"b8556f12-683b-4ea7-a325-a1e0f98f86e7","Type":"ContainerStarted","Data":"1c3b93d73a63ac49158388d635dc2f7e1a793eab0887833947c910bac31d1c2c"} Dec 01 22:46:00 crc kubenswrapper[4962]: I1201 22:46:00.667507 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt8md" event={"ID":"22b11b92-29b0-4134-a443-4656e66fbb77","Type":"ContainerStarted","Data":"460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03"} Dec 01 22:46:00 crc kubenswrapper[4962]: I1201 22:46:00.688218 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f7rrw" podStartSLOduration=4.049166507 podStartE2EDuration="8.688193629s" podCreationTimestamp="2025-12-01 22:45:52 +0000 UTC" firstStartedPulling="2025-12-01 22:45:54.526994828 +0000 UTC m=+4338.628434023" lastFinishedPulling="2025-12-01 22:45:59.16602194 +0000 UTC m=+4343.267461145" observedRunningTime="2025-12-01 22:46:00.683007762 +0000 UTC m=+4344.784446997" watchObservedRunningTime="2025-12-01 22:46:00.688193629 +0000 UTC m=+4344.789632834" Dec 01 22:46:00 crc kubenswrapper[4962]: I1201 22:46:00.722712 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lt8md" podStartSLOduration=4.075640191 podStartE2EDuration="9.722690737s" podCreationTimestamp="2025-12-01 22:45:51 +0000 UTC" firstStartedPulling="2025-12-01 22:45:53.512494651 +0000 UTC m=+4337.613933846" lastFinishedPulling="2025-12-01 22:45:59.159545187 +0000 UTC m=+4343.260984392" observedRunningTime="2025-12-01 22:46:00.708752952 +0000 UTC m=+4344.810192157" watchObservedRunningTime="2025-12-01 22:46:00.722690737 +0000 UTC m=+4344.824129942" Dec 01 22:46:02 crc kubenswrapper[4962]: I1201 22:46:02.344849 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:46:02 crc kubenswrapper[4962]: I1201 22:46:02.345240 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:46:02 crc kubenswrapper[4962]: I1201 22:46:02.784210 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:46:02 crc kubenswrapper[4962]: I1201 22:46:02.784283 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:46:02 crc kubenswrapper[4962]: I1201 22:46:02.784335 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 
22:46:02 crc kubenswrapper[4962]: I1201 22:46:02.785595 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4d9a7193f7ebfe349227a99972c1252553194d867623ce522be000424eb6be4"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 22:46:02 crc kubenswrapper[4962]: I1201 22:46:02.785659 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://c4d9a7193f7ebfe349227a99972c1252553194d867623ce522be000424eb6be4" gracePeriod=600 Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.268407 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.269036 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.343215 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.416675 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lt8md" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="registry-server" probeResult="failure" output=< Dec 01 22:46:03 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:46:03 crc kubenswrapper[4962]: > Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.708093 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="c4d9a7193f7ebfe349227a99972c1252553194d867623ce522be000424eb6be4" exitCode=0 Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.708245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"c4d9a7193f7ebfe349227a99972c1252553194d867623ce522be000424eb6be4"} Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.708782 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8"} Dec 01 22:46:03 crc kubenswrapper[4962]: I1201 22:46:03.708821 4962 scope.go:117] "RemoveContainer" containerID="36a09ef847cf5ec44e4a7f255c073008d056414bc9c5b5f97a96b84dbe8dff75" Dec 01 22:46:12 crc kubenswrapper[4962]: I1201 22:46:12.415465 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:46:12 crc kubenswrapper[4962]: I1201 22:46:12.485465 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:46:12 crc kubenswrapper[4962]: I1201 22:46:12.657661 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lt8md"] Dec 01 22:46:13 crc kubenswrapper[4962]: I1201 22:46:13.332386 4962 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:46:13 crc kubenswrapper[4962]: I1201 22:46:13.848061 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lt8md" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="registry-server" containerID="cri-o://460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03" gracePeriod=2 Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.408087 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.508367 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvhpd\" (UniqueName: \"kubernetes.io/projected/22b11b92-29b0-4134-a443-4656e66fbb77-kube-api-access-nvhpd\") pod \"22b11b92-29b0-4134-a443-4656e66fbb77\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.508412 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-utilities\") pod \"22b11b92-29b0-4134-a443-4656e66fbb77\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.508434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-catalog-content\") pod \"22b11b92-29b0-4134-a443-4656e66fbb77\" (UID: \"22b11b92-29b0-4134-a443-4656e66fbb77\") " Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.510282 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-utilities" (OuterVolumeSpecName: "utilities") pod "22b11b92-29b0-4134-a443-4656e66fbb77" (UID: "22b11b92-29b0-4134-a443-4656e66fbb77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.518005 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b11b92-29b0-4134-a443-4656e66fbb77-kube-api-access-nvhpd" (OuterVolumeSpecName: "kube-api-access-nvhpd") pod "22b11b92-29b0-4134-a443-4656e66fbb77" (UID: "22b11b92-29b0-4134-a443-4656e66fbb77"). InnerVolumeSpecName "kube-api-access-nvhpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.612215 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvhpd\" (UniqueName: \"kubernetes.io/projected/22b11b92-29b0-4134-a443-4656e66fbb77-kube-api-access-nvhpd\") on node \"crc\" DevicePath \"\"" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.612275 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.630132 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22b11b92-29b0-4134-a443-4656e66fbb77" (UID: "22b11b92-29b0-4134-a443-4656e66fbb77"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.714927 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b11b92-29b0-4134-a443-4656e66fbb77-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.865117 4962 generic.go:334] "Generic (PLEG): container finished" podID="22b11b92-29b0-4134-a443-4656e66fbb77" containerID="460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03" exitCode=0 Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.865164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt8md" event={"ID":"22b11b92-29b0-4134-a443-4656e66fbb77","Type":"ContainerDied","Data":"460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03"} Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.865193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt8md" event={"ID":"22b11b92-29b0-4134-a443-4656e66fbb77","Type":"ContainerDied","Data":"e8fa8abf02b29dedbd97aad9bb25dddb45491d9bca7a3a8d32a7065ce1500649"} Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.865215 4962 scope.go:117] "RemoveContainer" containerID="460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.865218 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lt8md" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.895998 4962 scope.go:117] "RemoveContainer" containerID="18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.927620 4962 scope.go:117] "RemoveContainer" containerID="8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.953803 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lt8md"] Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.970521 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lt8md"] Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.990670 4962 scope.go:117] "RemoveContainer" containerID="460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03" Dec 01 22:46:14 crc kubenswrapper[4962]: E1201 22:46:14.991296 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03\": container with ID starting with 460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03 not found: ID does not exist" containerID="460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.991337 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03"} err="failed to get container status \"460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03\": rpc error: code = NotFound desc = could not find container \"460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03\": container with ID starting with 460916fc4567e84486ec58cb52cd15a8036920a838ecc51e8bf3b606f4176b03 not found: ID does not exist" Dec 01 22:46:14 crc 
kubenswrapper[4962]: I1201 22:46:14.991363 4962 scope.go:117] "RemoveContainer" containerID="18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580" Dec 01 22:46:14 crc kubenswrapper[4962]: E1201 22:46:14.991721 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580\": container with ID starting with 18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580 not found: ID does not exist" containerID="18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.991748 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580"} err="failed to get container status \"18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580\": rpc error: code = NotFound desc = could not find container \"18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580\": container with ID starting with 18a97517a7ea6ee8b9cd2d438137fb6ad64161c0f8f16eb4d6bc65f4db81b580 not found: ID does not exist" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.991766 4962 scope.go:117] "RemoveContainer" containerID="8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02" Dec 01 22:46:14 crc kubenswrapper[4962]: E1201 22:46:14.992112 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02\": container with ID starting with 8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02 not found: ID does not exist" containerID="8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02" Dec 01 22:46:14 crc kubenswrapper[4962]: I1201 22:46:14.992167 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02"} err="failed to get container status \"8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02\": rpc error: code = NotFound desc = could not find container \"8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02\": container with ID starting with 8977ad18b647cdcbf8dc631f2090bd615d22c7b2eb0ddfc22600dbc7fb312e02 not found: ID does not exist" Dec 01 22:46:15 crc kubenswrapper[4962]: I1201 22:46:15.666424 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7rrw"] Dec 01 22:46:15 crc kubenswrapper[4962]: I1201 22:46:15.666979 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f7rrw" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="registry-server" containerID="cri-o://1c3b93d73a63ac49158388d635dc2f7e1a793eab0887833947c910bac31d1c2c" gracePeriod=2 Dec 01 22:46:15 crc kubenswrapper[4962]: I1201 22:46:15.884228 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerID="1c3b93d73a63ac49158388d635dc2f7e1a793eab0887833947c910bac31d1c2c" exitCode=0 Dec 01 22:46:15 crc kubenswrapper[4962]: I1201 22:46:15.884536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7rrw" 
event={"ID":"b8556f12-683b-4ea7-a325-a1e0f98f86e7","Type":"ContainerDied","Data":"1c3b93d73a63ac49158388d635dc2f7e1a793eab0887833947c910bac31d1c2c"} Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.233778 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.235265 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" path="/var/lib/kubelet/pods/22b11b92-29b0-4134-a443-4656e66fbb77/volumes" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.370023 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-catalog-content\") pod \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.370578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkq2x\" (UniqueName: \"kubernetes.io/projected/b8556f12-683b-4ea7-a325-a1e0f98f86e7-kube-api-access-wkq2x\") pod \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.370727 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-utilities\") pod \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\" (UID: \"b8556f12-683b-4ea7-a325-a1e0f98f86e7\") " Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.371707 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-utilities" (OuterVolumeSpecName: "utilities") pod "b8556f12-683b-4ea7-a325-a1e0f98f86e7" (UID: "b8556f12-683b-4ea7-a325-a1e0f98f86e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.372524 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.378445 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8556f12-683b-4ea7-a325-a1e0f98f86e7-kube-api-access-wkq2x" (OuterVolumeSpecName: "kube-api-access-wkq2x") pod "b8556f12-683b-4ea7-a325-a1e0f98f86e7" (UID: "b8556f12-683b-4ea7-a325-a1e0f98f86e7"). InnerVolumeSpecName "kube-api-access-wkq2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.422411 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8556f12-683b-4ea7-a325-a1e0f98f86e7" (UID: "b8556f12-683b-4ea7-a325-a1e0f98f86e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.475128 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8556f12-683b-4ea7-a325-a1e0f98f86e7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.475160 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkq2x\" (UniqueName: \"kubernetes.io/projected/b8556f12-683b-4ea7-a325-a1e0f98f86e7-kube-api-access-wkq2x\") on node \"crc\" DevicePath \"\"" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.908056 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7rrw" event={"ID":"b8556f12-683b-4ea7-a325-a1e0f98f86e7","Type":"ContainerDied","Data":"dda822edc4936b40732aea3ead61efb5957dc01f9583190477b15d3e1e3a634b"} Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.908131 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7rrw" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.908158 4962 scope.go:117] "RemoveContainer" containerID="1c3b93d73a63ac49158388d635dc2f7e1a793eab0887833947c910bac31d1c2c" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.952027 4962 scope.go:117] "RemoveContainer" containerID="e39ed7d1b94d3829da78b7482cf1233cbf408127aa345a80aade3a83f04651aa" Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.957417 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7rrw"] Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.968018 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f7rrw"] Dec 01 22:46:16 crc kubenswrapper[4962]: I1201 22:46:16.983800 4962 scope.go:117] "RemoveContainer" containerID="58140522fd40baab94327991e190577b42cdf569c628be974ae37a6afe2fc261" Dec 01 22:46:18 crc kubenswrapper[4962]: I1201 22:46:18.292188 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" path="/var/lib/kubelet/pods/b8556f12-683b-4ea7-a325-a1e0f98f86e7/volumes" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.931836 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ngvfw"] Dec 01 22:48:26 crc kubenswrapper[4962]: E1201 22:48:26.933244 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="registry-server" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933258 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="registry-server" Dec 01 22:48:26 crc kubenswrapper[4962]: E1201 22:48:26.933283 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="extract-utilities" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933289 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="extract-utilities" Dec 01 22:48:26 crc kubenswrapper[4962]: E1201 22:48:26.933317 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="extract-content" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933324 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="extract-content" Dec 01 22:48:26 crc kubenswrapper[4962]: E1201 22:48:26.933358 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="extract-utilities" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933364 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="extract-utilities" Dec 01 22:48:26 crc kubenswrapper[4962]: E1201 22:48:26.933375 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="registry-server" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933381 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="registry-server" Dec 01 22:48:26 crc kubenswrapper[4962]: E1201 22:48:26.933400 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="extract-content" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933406 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="extract-content" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933614 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8556f12-683b-4ea7-a325-a1e0f98f86e7" containerName="registry-server" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.933636 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b11b92-29b0-4134-a443-4656e66fbb77" containerName="registry-server" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.935580 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:26 crc kubenswrapper[4962]: I1201 22:48:26.952640 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngvfw"] Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.050943 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-catalog-content\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.051298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55bcm\" (UniqueName: \"kubernetes.io/projected/1ea127e2-c032-440a-abcc-8e82da562a4e-kube-api-access-55bcm\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.051431 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-utilities\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.153253 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-catalog-content\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.153332 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55bcm\" (UniqueName: \"kubernetes.io/projected/1ea127e2-c032-440a-abcc-8e82da562a4e-kube-api-access-55bcm\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.153392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-utilities\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.153896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-catalog-content\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.153927 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-utilities\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.172845 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-55bcm\" (UniqueName: \"kubernetes.io/projected/1ea127e2-c032-440a-abcc-8e82da562a4e-kube-api-access-55bcm\") pod \"community-operators-ngvfw\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.267605 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:27 crc kubenswrapper[4962]: I1201 22:48:27.934998 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngvfw"] Dec 01 22:48:28 crc kubenswrapper[4962]: I1201 22:48:28.779737 4962 generic.go:334] "Generic (PLEG): container finished" podID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerID="e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8" exitCode=0 Dec 01 22:48:28 crc kubenswrapper[4962]: I1201 22:48:28.780002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvfw" event={"ID":"1ea127e2-c032-440a-abcc-8e82da562a4e","Type":"ContainerDied","Data":"e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8"} Dec 01 22:48:28 crc kubenswrapper[4962]: I1201 22:48:28.780162 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvfw" event={"ID":"1ea127e2-c032-440a-abcc-8e82da562a4e","Type":"ContainerStarted","Data":"e5037f3a48a18583bfac401accd7f60494f6d23534e8277ded50dfa80bc44a82"} Dec 01 22:48:30 crc kubenswrapper[4962]: I1201 22:48:30.811443 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvfw" event={"ID":"1ea127e2-c032-440a-abcc-8e82da562a4e","Type":"ContainerStarted","Data":"7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25"} Dec 01 22:48:31 crc kubenswrapper[4962]: I1201 22:48:31.828849 4962 generic.go:334] "Generic (PLEG): container finished" podID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerID="7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25" exitCode=0 Dec 01 22:48:31 crc kubenswrapper[4962]: I1201 22:48:31.829054 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvfw" event={"ID":"1ea127e2-c032-440a-abcc-8e82da562a4e","Type":"ContainerDied","Data":"7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25"} Dec 01 22:48:32 crc kubenswrapper[4962]: I1201 22:48:32.784657 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:48:32 crc kubenswrapper[4962]: I1201 22:48:32.785338 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:48:32 crc kubenswrapper[4962]: I1201 22:48:32.840540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvfw" event={"ID":"1ea127e2-c032-440a-abcc-8e82da562a4e","Type":"ContainerStarted","Data":"cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c"} Dec 01 
22:48:32 crc kubenswrapper[4962]: I1201 22:48:32.880339 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ngvfw" podStartSLOduration=3.357857362 podStartE2EDuration="6.880316192s" podCreationTimestamp="2025-12-01 22:48:26 +0000 UTC" firstStartedPulling="2025-12-01 22:48:28.782299216 +0000 UTC m=+4492.883738421" lastFinishedPulling="2025-12-01 22:48:32.304758016 +0000 UTC m=+4496.406197251" observedRunningTime="2025-12-01 22:48:32.867298673 +0000 UTC m=+4496.968737878" watchObservedRunningTime="2025-12-01 22:48:32.880316192 +0000 UTC m=+4496.981755377" Dec 01 22:48:37 crc kubenswrapper[4962]: I1201 22:48:37.268320 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:37 crc kubenswrapper[4962]: I1201 22:48:37.268905 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:37 crc kubenswrapper[4962]: I1201 22:48:37.355174 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:38 crc kubenswrapper[4962]: I1201 22:48:38.096899 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:38 crc kubenswrapper[4962]: I1201 22:48:38.154713 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngvfw"] Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.039913 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ngvfw" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="registry-server" containerID="cri-o://cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c" gracePeriod=2 Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.630327 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.786215 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55bcm\" (UniqueName: \"kubernetes.io/projected/1ea127e2-c032-440a-abcc-8e82da562a4e-kube-api-access-55bcm\") pod \"1ea127e2-c032-440a-abcc-8e82da562a4e\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.786285 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-utilities\") pod \"1ea127e2-c032-440a-abcc-8e82da562a4e\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.786439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-catalog-content\") pod \"1ea127e2-c032-440a-abcc-8e82da562a4e\" (UID: \"1ea127e2-c032-440a-abcc-8e82da562a4e\") " Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.787224 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-utilities" (OuterVolumeSpecName: "utilities") pod "1ea127e2-c032-440a-abcc-8e82da562a4e" (UID: "1ea127e2-c032-440a-abcc-8e82da562a4e"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.851799 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ea127e2-c032-440a-abcc-8e82da562a4e" (UID: "1ea127e2-c032-440a-abcc-8e82da562a4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.888813 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:48:40 crc kubenswrapper[4962]: I1201 22:48:40.888844 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ea127e2-c032-440a-abcc-8e82da562a4e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.058386 4962 generic.go:334] "Generic (PLEG): container finished" podID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerID="cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c" exitCode=0 Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.058455 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvfw" event={"ID":"1ea127e2-c032-440a-abcc-8e82da562a4e","Type":"ContainerDied","Data":"cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c"} Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.058506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvfw" event={"ID":"1ea127e2-c032-440a-abcc-8e82da562a4e","Type":"ContainerDied","Data":"e5037f3a48a18583bfac401accd7f60494f6d23534e8277ded50dfa80bc44a82"} Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.058518 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngvfw" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.058537 4962 scope.go:117] "RemoveContainer" containerID="cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.103949 4962 scope.go:117] "RemoveContainer" containerID="7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.390061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea127e2-c032-440a-abcc-8e82da562a4e-kube-api-access-55bcm" (OuterVolumeSpecName: "kube-api-access-55bcm") pod "1ea127e2-c032-440a-abcc-8e82da562a4e" (UID: "1ea127e2-c032-440a-abcc-8e82da562a4e"). InnerVolumeSpecName "kube-api-access-55bcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.402329 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55bcm\" (UniqueName: \"kubernetes.io/projected/1ea127e2-c032-440a-abcc-8e82da562a4e-kube-api-access-55bcm\") on node \"crc\" DevicePath \"\"" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.450543 4962 scope.go:117] "RemoveContainer" containerID="e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.629221 4962 scope.go:117] "RemoveContainer" containerID="cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c" Dec 01 22:48:41 crc kubenswrapper[4962]: E1201 22:48:41.629744 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c\": container with ID starting with cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c not found: ID does not exist" containerID="cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.629788 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c"} err="failed to get container status \"cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c\": rpc error: code = NotFound desc = could not find container \"cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c\": container with ID starting with cfd5e77eef46a541845362e1d66b09f9897396c71400af143edc230970d8587c not found: ID does not exist" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.629816 4962 scope.go:117] "RemoveContainer" containerID="7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25" Dec 01 22:48:41 crc kubenswrapper[4962]: E1201 22:48:41.630298 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25\": container with ID starting with 7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25 not found: ID does not exist" containerID="7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.630331 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25"} err="failed to get container status \"7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25\": rpc error: code = NotFound desc = could not find container \"7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25\": container with ID starting with 7a6efa3ae526c891b140e703f411463407083af9dda1fd349f63ff6b42f27e25 not found: ID does not exist" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.630353 4962 scope.go:117] "RemoveContainer" containerID="e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8" Dec 01 22:48:41 crc kubenswrapper[4962]: E1201 22:48:41.630695 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8\": container with ID starting with e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8 not found: ID does not 
exist" containerID="e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.630714 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8"} err="failed to get container status \"e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8\": rpc error: code = NotFound desc = could not find container \"e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8\": container with ID starting with e3cafb73966aec3ef3f1ba216d03763c0f66d0090d4b249026a5f3376a57ffe8 not found: ID does not exist" Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.692213 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngvfw"] Dec 01 22:48:41 crc kubenswrapper[4962]: I1201 22:48:41.703749 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ngvfw"] Dec 01 22:48:42 crc kubenswrapper[4962]: I1201 22:48:42.236921 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" path="/var/lib/kubelet/pods/1ea127e2-c032-440a-abcc-8e82da562a4e/volumes" Dec 01 22:49:02 crc kubenswrapper[4962]: I1201 22:49:02.784431 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:49:02 crc kubenswrapper[4962]: I1201 22:49:02.784987 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:49:32 crc kubenswrapper[4962]: I1201 22:49:32.784169 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:49:32 crc kubenswrapper[4962]: I1201 22:49:32.784581 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:49:32 crc kubenswrapper[4962]: I1201 22:49:32.784623 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 22:49:32 crc kubenswrapper[4962]: I1201 22:49:32.785469 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 22:49:32 crc kubenswrapper[4962]: I1201 22:49:32.785515 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" gracePeriod=600 Dec 01 22:49:32 crc kubenswrapper[4962]: E1201 22:49:32.923417 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:49:33 crc kubenswrapper[4962]: I1201 22:49:33.727671 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" exitCode=0 Dec 01 22:49:33 crc kubenswrapper[4962]: I1201 22:49:33.727775 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8"} Dec 01 22:49:33 crc kubenswrapper[4962]: I1201 22:49:33.728075 4962 scope.go:117] "RemoveContainer" containerID="c4d9a7193f7ebfe349227a99972c1252553194d867623ce522be000424eb6be4" Dec 01 22:49:33 crc kubenswrapper[4962]: I1201 22:49:33.728708 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:49:33 crc kubenswrapper[4962]: E1201 22:49:33.729232 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:49:47 crc kubenswrapper[4962]: I1201 22:49:47.220676 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:49:47 crc kubenswrapper[4962]: E1201 22:49:47.222023 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:50:00 crc kubenswrapper[4962]: I1201 22:50:00.221160 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:50:00 crc kubenswrapper[4962]: E1201 22:50:00.222330 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:50:14 crc 
kubenswrapper[4962]: I1201 22:50:14.221141 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:50:14 crc kubenswrapper[4962]: E1201 22:50:14.224917 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:50:29 crc kubenswrapper[4962]: I1201 22:50:29.220541 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:50:29 crc kubenswrapper[4962]: E1201 22:50:29.222055 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:50:30 crc kubenswrapper[4962]: E1201 22:50:30.912448 4962 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:39944->38.102.83.110:46143: write tcp 38.102.83.110:39944->38.102.83.110:46143: write: broken pipe Dec 01 22:50:43 crc kubenswrapper[4962]: I1201 22:50:43.220419 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:50:43 crc kubenswrapper[4962]: E1201 22:50:43.223326 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:50:56 crc kubenswrapper[4962]: I1201 22:50:56.220384 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:50:56 crc kubenswrapper[4962]: E1201 22:50:56.221355 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:51:08 crc kubenswrapper[4962]: I1201 22:51:08.220626 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:51:08 crc kubenswrapper[4962]: E1201 22:51:08.221595 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" 
podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:51:20 crc kubenswrapper[4962]: I1201 22:51:20.220852 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:51:20 crc kubenswrapper[4962]: E1201 22:51:20.222033 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:51:31 crc kubenswrapper[4962]: I1201 22:51:31.220598 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:51:31 crc kubenswrapper[4962]: E1201 22:51:31.221784 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:51:44 crc kubenswrapper[4962]: I1201 22:51:44.220161 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:51:44 crc kubenswrapper[4962]: E1201 22:51:44.221481 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:51:57 crc kubenswrapper[4962]: I1201 22:51:57.221064 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:51:57 crc kubenswrapper[4962]: E1201 22:51:57.221911 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:52:08 crc kubenswrapper[4962]: I1201 22:52:08.222236 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:52:08 crc kubenswrapper[4962]: E1201 22:52:08.223872 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:52:20 crc kubenswrapper[4962]: I1201 22:52:20.220603 4962 scope.go:117] "RemoveContainer" 
containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:52:20 crc kubenswrapper[4962]: E1201 22:52:20.221909 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:52:31 crc kubenswrapper[4962]: I1201 22:52:31.219794 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:52:31 crc kubenswrapper[4962]: E1201 22:52:31.220898 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:52:44 crc kubenswrapper[4962]: I1201 22:52:44.332238 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:52:44 crc kubenswrapper[4962]: E1201 22:52:44.332986 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:52:59 crc kubenswrapper[4962]: I1201 22:52:59.221813 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:52:59 crc kubenswrapper[4962]: E1201 22:52:59.223722 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:53:11 crc kubenswrapper[4962]: I1201 22:53:11.221291 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:53:11 crc kubenswrapper[4962]: E1201 22:53:11.222566 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:53:22 crc kubenswrapper[4962]: I1201 22:53:22.220645 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:53:22 crc kubenswrapper[4962]: E1201 22:53:22.221694 4962 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:53:36 crc kubenswrapper[4962]: I1201 22:53:36.230629 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:53:36 crc kubenswrapper[4962]: E1201 22:53:36.231668 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:53:51 crc kubenswrapper[4962]: I1201 22:53:51.220340 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:53:51 crc kubenswrapper[4962]: E1201 22:53:51.221409 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:54:06 crc kubenswrapper[4962]: I1201 22:54:06.233777 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:54:06 crc kubenswrapper[4962]: E1201 22:54:06.234582 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.864325 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 22:54:12 crc kubenswrapper[4962]: E1201 22:54:12.865322 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="registry-server" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.865338 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="registry-server" Dec 01 22:54:12 crc kubenswrapper[4962]: E1201 22:54:12.865368 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="extract-content" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.865376 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="extract-content" Dec 01 22:54:12 crc kubenswrapper[4962]: E1201 22:54:12.865418 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="extract-utilities" 
Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.865426 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="extract-utilities" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.865709 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ea127e2-c032-440a-abcc-8e82da562a4e" containerName="registry-server" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.866852 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.869859 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.869926 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.870099 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.870621 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tqkv8" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.904813 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.994226 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.994500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.994544 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.994862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.995193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.995369 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mnh\" (UniqueName: \"kubernetes.io/projected/07461b2c-c45f-45cf-a540-4c24797e3f16-kube-api-access-78mnh\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.995434 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.995500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-config-data\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:12 crc kubenswrapper[4962]: I1201 22:54:12.995759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.097790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-config-data\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.098036 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.099038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.099261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.099301 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-config-data\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.099648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.099826 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.099876 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.100067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.100192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.100293 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78mnh\" (UniqueName: \"kubernetes.io/projected/07461b2c-c45f-45cf-a540-4c24797e3f16-kube-api-access-78mnh\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.100342 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.101316 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.101638 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.105402 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config-secret\") pod 
\"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.105686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.110551 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.127228 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mnh\" (UniqueName: \"kubernetes.io/projected/07461b2c-c45f-45cf-a540-4c24797e3f16-kube-api-access-78mnh\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.145648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.212160 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 22:54:13 crc kubenswrapper[4962]: W1201 22:54:13.768384 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07461b2c_c45f_45cf_a540_4c24797e3f16.slice/crio-5072c03127bd12d3e58ebd9fbdfdee2424fe9cf1e80c53f50a6bd6cc15fc6d98 WatchSource:0}: Error finding container 5072c03127bd12d3e58ebd9fbdfdee2424fe9cf1e80c53f50a6bd6cc15fc6d98: Status 404 returned error can't find the container with id 5072c03127bd12d3e58ebd9fbdfdee2424fe9cf1e80c53f50a6bd6cc15fc6d98 Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.770650 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 22:54:13 crc kubenswrapper[4962]: I1201 22:54:13.776958 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 22:54:14 crc kubenswrapper[4962]: I1201 22:54:14.267067 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"07461b2c-c45f-45cf-a540-4c24797e3f16","Type":"ContainerStarted","Data":"5072c03127bd12d3e58ebd9fbdfdee2424fe9cf1e80c53f50a6bd6cc15fc6d98"} Dec 01 22:54:17 crc kubenswrapper[4962]: I1201 22:54:17.220422 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:54:17 crc kubenswrapper[4962]: E1201 22:54:17.220871 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" 
podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:54:32 crc kubenswrapper[4962]: I1201 22:54:32.220318 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:54:32 crc kubenswrapper[4962]: E1201 22:54:32.221200 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 22:54:44 crc kubenswrapper[4962]: I1201 22:54:44.220735 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:54:48 crc kubenswrapper[4962]: E1201 22:54:48.965810 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 01 22:54:48 crc kubenswrapper[4962]: E1201 22:54:48.967690 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78mnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesys
tem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(07461b2c-c45f-45cf-a540-4c24797e3f16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 22:54:48 crc kubenswrapper[4962]: E1201 22:54:48.969538 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="07461b2c-c45f-45cf-a540-4c24797e3f16" Dec 01 22:54:49 crc kubenswrapper[4962]: I1201 22:54:49.727605 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"3ad1e56f37f13a30e1370bba29af150bae21baabff59d52513f0ad149646a414"} Dec 01 22:54:49 crc kubenswrapper[4962]: E1201 22:54:49.731294 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="07461b2c-c45f-45cf-a540-4c24797e3f16" Dec 01 22:55:05 crc kubenswrapper[4962]: I1201 22:55:05.802264 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 22:55:08 crc kubenswrapper[4962]: I1201 22:55:08.031353 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"07461b2c-c45f-45cf-a540-4c24797e3f16","Type":"ContainerStarted","Data":"a0ab3808c752994cfb40eb5bf5f32918c1237cf6c55c98ddc6929d7e2787602e"} Dec 01 22:55:08 crc kubenswrapper[4962]: I1201 22:55:08.068633 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.041488645 podStartE2EDuration="57.068612385s" podCreationTimestamp="2025-12-01 22:54:11 +0000 UTC" firstStartedPulling="2025-12-01 22:54:13.770473999 +0000 UTC m=+4837.871913194" lastFinishedPulling="2025-12-01 22:55:05.797597709 +0000 UTC m=+4889.899036934" observedRunningTime="2025-12-01 22:55:08.053250679 +0000 UTC m=+4892.154689914" watchObservedRunningTime="2025-12-01 22:55:08.068612385 +0000 UTC m=+4892.170051600" Dec 01 22:57:02 crc kubenswrapper[4962]: I1201 22:57:02.786405 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:57:02 crc kubenswrapper[4962]: 
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.042314 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m8pnw"]
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.048708 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.121257 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8pnw"]
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.220319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvl8p\" (UniqueName: \"kubernetes.io/projected/b3d1245e-f548-4148-98b3-fea76d3ce979-kube-api-access-tvl8p\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.220494 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-utilities\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.220617 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.323354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-utilities\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.323591 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.323733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvl8p\" (UniqueName: \"kubernetes.io/projected/b3d1245e-f548-4148-98b3-fea76d3ce979-kube-api-access-tvl8p\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.326493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-utilities\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.326600 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.368215 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvl8p\" (UniqueName: \"kubernetes.io/projected/b3d1245e-f548-4148-98b3-fea76d3ce979-kube-api-access-tvl8p\") pod \"redhat-operators-m8pnw\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:20 crc kubenswrapper[4962]: I1201 22:57:20.374242 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8pnw"
Dec 01 22:57:21 crc kubenswrapper[4962]: I1201 22:57:21.253104 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8pnw"]
Dec 01 22:57:21 crc kubenswrapper[4962]: W1201 22:57:21.261497 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d1245e_f548_4148_98b3_fea76d3ce979.slice/crio-373961f78f5bc23622f51d298266c7b2e77a8c8f191e00e0fbe25cb9109ae264 WatchSource:0}: Error finding container 373961f78f5bc23622f51d298266c7b2e77a8c8f191e00e0fbe25cb9109ae264: Status 404 returned error can't find the container with id 373961f78f5bc23622f51d298266c7b2e77a8c8f191e00e0fbe25cb9109ae264
Dec 01 22:57:21 crc kubenswrapper[4962]: I1201 22:57:21.579378 4962 generic.go:334] "Generic (PLEG): container finished" podID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerID="bfb004dce4776211af6a408e7e88230c1545cee5375dcb1be49c6a913d7f1a1f" exitCode=0
Dec 01 22:57:21 crc kubenswrapper[4962]: I1201 22:57:21.579819 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8pnw" event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerDied","Data":"bfb004dce4776211af6a408e7e88230c1545cee5375dcb1be49c6a913d7f1a1f"}
Dec 01 22:57:21 crc kubenswrapper[4962]: I1201 22:57:21.579853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8pnw" event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerStarted","Data":"373961f78f5bc23622f51d298266c7b2e77a8c8f191e00e0fbe25cb9109ae264"}
Dec 01 22:57:24 crc kubenswrapper[4962]: I1201 22:57:24.620294 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8pnw" event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerStarted","Data":"5c53f10a1dbbad7939cf0258dac1c322fe4b283c34e9280710f76f18d6602254"}
Dec 01 22:57:28 crc kubenswrapper[4962]: I1201 22:57:28.011272 4962 generic.go:334] "Generic (PLEG): container finished" podID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerID="5c53f10a1dbbad7939cf0258dac1c322fe4b283c34e9280710f76f18d6602254" exitCode=0
Dec 01 22:57:28 crc kubenswrapper[4962]: I1201 22:57:28.011335 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8pnw" event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerDied","Data":"5c53f10a1dbbad7939cf0258dac1c322fe4b283c34e9280710f76f18d6602254"}
event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerDied","Data":"5c53f10a1dbbad7939cf0258dac1c322fe4b283c34e9280710f76f18d6602254"} Dec 01 22:57:29 crc kubenswrapper[4962]: I1201 22:57:29.025055 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8pnw" event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerStarted","Data":"dd29ad025a3f7e93dee234b3257a015a5b071f72a3f8d8a3dc825b61ec83b0d7"} Dec 01 22:57:29 crc kubenswrapper[4962]: I1201 22:57:29.058613 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m8pnw" podStartSLOduration=2.196765322 podStartE2EDuration="9.058592719s" podCreationTimestamp="2025-12-01 22:57:20 +0000 UTC" firstStartedPulling="2025-12-01 22:57:21.583715551 +0000 UTC m=+5025.685154746" lastFinishedPulling="2025-12-01 22:57:28.445542948 +0000 UTC m=+5032.546982143" observedRunningTime="2025-12-01 22:57:29.04876065 +0000 UTC m=+5033.150199945" watchObservedRunningTime="2025-12-01 22:57:29.058592719 +0000 UTC m=+5033.160031924" Dec 01 22:57:30 crc kubenswrapper[4962]: I1201 22:57:30.375540 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m8pnw" Dec 01 22:57:30 crc kubenswrapper[4962]: I1201 22:57:30.375974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m8pnw" Dec 01 22:57:31 crc kubenswrapper[4962]: I1201 22:57:31.449718 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8pnw" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="registry-server" probeResult="failure" output=< Dec 01 22:57:31 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:57:31 crc kubenswrapper[4962]: > Dec 01 22:57:32 crc kubenswrapper[4962]: I1201 22:57:32.784464 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 22:57:32 crc kubenswrapper[4962]: I1201 22:57:32.784541 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 22:57:41 crc kubenswrapper[4962]: I1201 22:57:41.457663 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8pnw" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="registry-server" probeResult="failure" output=< Dec 01 22:57:41 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:57:41 crc kubenswrapper[4962]: > Dec 01 22:57:50 crc kubenswrapper[4962]: I1201 22:57:50.442299 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m8pnw" Dec 01 22:57:50 crc kubenswrapper[4962]: I1201 22:57:50.499636 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m8pnw" Dec 01 22:57:51 crc kubenswrapper[4962]: I1201 22:57:51.245330 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-m8pnw"] Dec 01 22:57:51 crc kubenswrapper[4962]: I1201 22:57:51.877169 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m8pnw" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="registry-server" containerID="cri-o://dd29ad025a3f7e93dee234b3257a015a5b071f72a3f8d8a3dc825b61ec83b0d7" gracePeriod=2 Dec 01 22:57:52 crc kubenswrapper[4962]: I1201 22:57:52.902267 4962 generic.go:334] "Generic (PLEG): container finished" podID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerID="dd29ad025a3f7e93dee234b3257a015a5b071f72a3f8d8a3dc825b61ec83b0d7" exitCode=0 Dec 01 22:57:52 crc kubenswrapper[4962]: I1201 22:57:52.902359 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8pnw" event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerDied","Data":"dd29ad025a3f7e93dee234b3257a015a5b071f72a3f8d8a3dc825b61ec83b0d7"} Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.328538 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8pnw" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.431306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-utilities\") pod \"b3d1245e-f548-4148-98b3-fea76d3ce979\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.431570 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvl8p\" (UniqueName: \"kubernetes.io/projected/b3d1245e-f548-4148-98b3-fea76d3ce979-kube-api-access-tvl8p\") pod \"b3d1245e-f548-4148-98b3-fea76d3ce979\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.431649 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content\") pod \"b3d1245e-f548-4148-98b3-fea76d3ce979\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.435550 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-utilities" (OuterVolumeSpecName: "utilities") pod "b3d1245e-f548-4148-98b3-fea76d3ce979" (UID: "b3d1245e-f548-4148-98b3-fea76d3ce979"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.441893 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d1245e-f548-4148-98b3-fea76d3ce979-kube-api-access-tvl8p" (OuterVolumeSpecName: "kube-api-access-tvl8p") pod "b3d1245e-f548-4148-98b3-fea76d3ce979" (UID: "b3d1245e-f548-4148-98b3-fea76d3ce979"). InnerVolumeSpecName "kube-api-access-tvl8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.533527 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d1245e-f548-4148-98b3-fea76d3ce979" (UID: "b3d1245e-f548-4148-98b3-fea76d3ce979"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.533792 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content\") pod \"b3d1245e-f548-4148-98b3-fea76d3ce979\" (UID: \"b3d1245e-f548-4148-98b3-fea76d3ce979\") " Dec 01 22:57:53 crc kubenswrapper[4962]: W1201 22:57:53.534361 4962 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b3d1245e-f548-4148-98b3-fea76d3ce979/volumes/kubernetes.io~empty-dir/catalog-content Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.534395 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d1245e-f548-4148-98b3-fea76d3ce979" (UID: "b3d1245e-f548-4148-98b3-fea76d3ce979"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.534694 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.534709 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d1245e-f548-4148-98b3-fea76d3ce979-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.534718 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvl8p\" (UniqueName: \"kubernetes.io/projected/b3d1245e-f548-4148-98b3-fea76d3ce979-kube-api-access-tvl8p\") on node \"crc\" DevicePath \"\"" Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.918376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8pnw" event={"ID":"b3d1245e-f548-4148-98b3-fea76d3ce979","Type":"ContainerDied","Data":"373961f78f5bc23622f51d298266c7b2e77a8c8f191e00e0fbe25cb9109ae264"} Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.918428 4962 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.918926 4962 scope.go:117] "RemoveContainer" containerID="dd29ad025a3f7e93dee234b3257a015a5b071f72a3f8d8a3dc825b61ec83b0d7"
Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.957836 4962 scope.go:117] "RemoveContainer" containerID="5c53f10a1dbbad7939cf0258dac1c322fe4b283c34e9280710f76f18d6602254"
Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.990054 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m8pnw"]
Dec 01 22:57:53 crc kubenswrapper[4962]: I1201 22:57:53.998833 4962 scope.go:117] "RemoveContainer" containerID="bfb004dce4776211af6a408e7e88230c1545cee5375dcb1be49c6a913d7f1a1f"
Dec 01 22:57:54 crc kubenswrapper[4962]: I1201 22:57:54.003477 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m8pnw"]
Dec 01 22:57:54 crc kubenswrapper[4962]: I1201 22:57:54.234161 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" path="/var/lib/kubelet/pods/b3d1245e-f548-4148-98b3-fea76d3ce979/volumes"
Dec 01 22:58:02 crc kubenswrapper[4962]: I1201 22:58:02.792154 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 22:58:02 crc kubenswrapper[4962]: I1201 22:58:02.792639 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 22:58:02 crc kubenswrapper[4962]: I1201 22:58:02.792677 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k"
Dec 01 22:58:02 crc kubenswrapper[4962]: I1201 22:58:02.793542 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ad1e56f37f13a30e1370bba29af150bae21baabff59d52513f0ad149646a414"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 22:58:02 crc kubenswrapper[4962]: I1201 22:58:02.793586 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://3ad1e56f37f13a30e1370bba29af150bae21baabff59d52513f0ad149646a414" gracePeriod=600
Dec 01 22:58:03 crc kubenswrapper[4962]: I1201 22:58:03.033523 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="3ad1e56f37f13a30e1370bba29af150bae21baabff59d52513f0ad149646a414" exitCode=0
Dec 01 22:58:03 crc kubenswrapper[4962]: I1201 22:58:03.033791 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"3ad1e56f37f13a30e1370bba29af150bae21baabff59d52513f0ad149646a414"}
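Consistent with a liveness failureThreshold of 3 (failures logged at 22:57:02, 22:57:32, and 22:58:02), the kubelet then kills machine-config-daemon with its 600s grace period and restarts it. A one-off check of the same endpoint, with the URL copied verbatim from the probe output (running it on the node itself is an assumption):

import urllib.request

# URL copied from the probe records above.
try:
    with urllib.request.urlopen("http://127.0.0.1:8798/health", timeout=1) as resp:
        print("healthy:", resp.status)
except OSError as exc:
    print("probe would fail:", exc)  # e.g. the 'connection refused' seen above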
event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"3ad1e56f37f13a30e1370bba29af150bae21baabff59d52513f0ad149646a414"} Dec 01 22:58:03 crc kubenswrapper[4962]: I1201 22:58:03.033828 4962 scope.go:117] "RemoveContainer" containerID="1c8a86ee58923ccc90761e8aea7d554e82810ff183d68baeb16ce55d129d65d8" Dec 01 22:58:04 crc kubenswrapper[4962]: I1201 22:58:04.053577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa"} Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.796194 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jmph5"] Dec 01 22:58:48 crc kubenswrapper[4962]: E1201 22:58:48.797726 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="registry-server" Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.797904 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="registry-server" Dec 01 22:58:48 crc kubenswrapper[4962]: E1201 22:58:48.797971 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="extract-content" Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.797984 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="extract-content" Dec 01 22:58:48 crc kubenswrapper[4962]: E1201 22:58:48.798035 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="extract-utilities" Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.798052 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="extract-utilities" Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.798675 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d1245e-f548-4148-98b3-fea76d3ce979" containerName="registry-server" Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.802086 4962 util.go:30] "No sandbox for pod can be found. 
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.826977 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmph5"]
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.850595 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-utilities\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.850696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-catalog-content\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.850843 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gk8x\" (UniqueName: \"kubernetes.io/projected/1adeab69-fee0-4a3c-a575-c5d03963663d-kube-api-access-8gk8x\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.955347 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-utilities\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.955515 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-catalog-content\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.955739 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gk8x\" (UniqueName: \"kubernetes.io/projected/1adeab69-fee0-4a3c-a575-c5d03963663d-kube-api-access-8gk8x\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.955881 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-utilities\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.956262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-catalog-content\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:48 crc kubenswrapper[4962]: I1201 22:58:48.993166 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gk8x\" (UniqueName: \"kubernetes.io/projected/1adeab69-fee0-4a3c-a575-c5d03963663d-kube-api-access-8gk8x\") pod \"community-operators-jmph5\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:49 crc kubenswrapper[4962]: I1201 22:58:49.133250 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmph5"
Dec 01 22:58:49 crc kubenswrapper[4962]: I1201 22:58:49.809081 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmph5"]
Dec 01 22:58:50 crc kubenswrapper[4962]: I1201 22:58:50.630737 4962 generic.go:334] "Generic (PLEG): container finished" podID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerID="5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2" exitCode=0
Dec 01 22:58:50 crc kubenswrapper[4962]: I1201 22:58:50.630990 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmph5" event={"ID":"1adeab69-fee0-4a3c-a575-c5d03963663d","Type":"ContainerDied","Data":"5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2"}
Dec 01 22:58:50 crc kubenswrapper[4962]: I1201 22:58:50.631337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmph5" event={"ID":"1adeab69-fee0-4a3c-a575-c5d03963663d","Type":"ContainerStarted","Data":"3e8dbf417ff8504729d71cb18a936d9cccc12cfbfe12210e68a2823e34121c5d"}
Dec 01 22:58:52 crc kubenswrapper[4962]: I1201 22:58:52.660723 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmph5" event={"ID":"1adeab69-fee0-4a3c-a575-c5d03963663d","Type":"ContainerStarted","Data":"c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9"}
Dec 01 22:58:53 crc kubenswrapper[4962]: I1201 22:58:53.672161 4962 generic.go:334] "Generic (PLEG): container finished" podID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerID="c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9" exitCode=0
Dec 01 22:58:53 crc kubenswrapper[4962]: I1201 22:58:53.672228 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmph5" event={"ID":"1adeab69-fee0-4a3c-a575-c5d03963663d","Type":"ContainerDied","Data":"c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9"}
Dec 01 22:58:54 crc kubenswrapper[4962]: I1201 22:58:54.688450 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmph5" event={"ID":"1adeab69-fee0-4a3c-a575-c5d03963663d","Type":"ContainerStarted","Data":"37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa"}
Dec 01 22:58:54 crc kubenswrapper[4962]: I1201 22:58:54.708705 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jmph5" podStartSLOduration=3.1903622560000002 podStartE2EDuration="6.708685284s" podCreationTimestamp="2025-12-01 22:58:48 +0000 UTC" firstStartedPulling="2025-12-01 22:58:50.634977981 +0000 UTC m=+5114.736417186" lastFinishedPulling="2025-12-01 22:58:54.153301019 +0000 UTC m=+5118.254740214" observedRunningTime="2025-12-01 22:58:54.706044769 +0000 UTC m=+5118.807483994" watchObservedRunningTime="2025-12-01 22:58:54.708685284 +0000 UTC m=+5118.810124489"
Dec 01 22:58:59 crc kubenswrapper[4962]: I1201 22:58:59.134048 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jmph5"
probe="readiness" status="" pod="openshift-marketplace/community-operators-jmph5" Dec 01 22:58:59 crc kubenswrapper[4962]: I1201 22:58:59.134753 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jmph5" Dec 01 22:58:59 crc kubenswrapper[4962]: I1201 22:58:59.260909 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jmph5" Dec 01 22:58:59 crc kubenswrapper[4962]: I1201 22:58:59.808682 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jmph5" Dec 01 22:58:59 crc kubenswrapper[4962]: I1201 22:58:59.863180 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmph5"] Dec 01 22:59:01 crc kubenswrapper[4962]: I1201 22:59:01.766644 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jmph5" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="registry-server" containerID="cri-o://37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa" gracePeriod=2 Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.519165 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmph5" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.691342 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-utilities\") pod \"1adeab69-fee0-4a3c-a575-c5d03963663d\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.691500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-catalog-content\") pod \"1adeab69-fee0-4a3c-a575-c5d03963663d\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.691600 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gk8x\" (UniqueName: \"kubernetes.io/projected/1adeab69-fee0-4a3c-a575-c5d03963663d-kube-api-access-8gk8x\") pod \"1adeab69-fee0-4a3c-a575-c5d03963663d\" (UID: \"1adeab69-fee0-4a3c-a575-c5d03963663d\") " Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.692323 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-utilities" (OuterVolumeSpecName: "utilities") pod "1adeab69-fee0-4a3c-a575-c5d03963663d" (UID: "1adeab69-fee0-4a3c-a575-c5d03963663d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.698567 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adeab69-fee0-4a3c-a575-c5d03963663d-kube-api-access-8gk8x" (OuterVolumeSpecName: "kube-api-access-8gk8x") pod "1adeab69-fee0-4a3c-a575-c5d03963663d" (UID: "1adeab69-fee0-4a3c-a575-c5d03963663d"). InnerVolumeSpecName "kube-api-access-8gk8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.743493 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1adeab69-fee0-4a3c-a575-c5d03963663d" (UID: "1adeab69-fee0-4a3c-a575-c5d03963663d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.790462 4962 generic.go:334] "Generic (PLEG): container finished" podID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerID="37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa" exitCode=0 Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.790508 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmph5" event={"ID":"1adeab69-fee0-4a3c-a575-c5d03963663d","Type":"ContainerDied","Data":"37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa"} Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.790534 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmph5" event={"ID":"1adeab69-fee0-4a3c-a575-c5d03963663d","Type":"ContainerDied","Data":"3e8dbf417ff8504729d71cb18a936d9cccc12cfbfe12210e68a2823e34121c5d"} Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.790555 4962 scope.go:117] "RemoveContainer" containerID="37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.790686 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmph5" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.793921 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.794533 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gk8x\" (UniqueName: \"kubernetes.io/projected/1adeab69-fee0-4a3c-a575-c5d03963663d-kube-api-access-8gk8x\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.794547 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1adeab69-fee0-4a3c-a575-c5d03963663d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.826260 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmph5"] Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.827090 4962 scope.go:117] "RemoveContainer" containerID="c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.859411 4962 scope.go:117] "RemoveContainer" containerID="5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.861232 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jmph5"] Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.900545 4962 scope.go:117] "RemoveContainer" containerID="37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa" Dec 01 22:59:02 crc kubenswrapper[4962]: E1201 22:59:02.902175 4962 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa\": container with ID starting with 37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa not found: ID does not exist" containerID="37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.902270 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa"} err="failed to get container status \"37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa\": rpc error: code = NotFound desc = could not find container \"37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa\": container with ID starting with 37bea95821cce7f55de03a6d967e4897cec5b5b4016a398cf622fd05e01ddafa not found: ID does not exist" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.902312 4962 scope.go:117] "RemoveContainer" containerID="c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9" Dec 01 22:59:02 crc kubenswrapper[4962]: E1201 22:59:02.903155 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9\": container with ID starting with c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9 not found: ID does not exist" containerID="c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.903208 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9"} err="failed to get container status \"c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9\": rpc error: code = NotFound desc = could not find container \"c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9\": container with ID starting with c16d529e463e8eaea7e56b93099dbf851aa7d4d024d85a2adb578e21b1cd11a9 not found: ID does not exist" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.903230 4962 scope.go:117] "RemoveContainer" containerID="5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2" Dec 01 22:59:02 crc kubenswrapper[4962]: E1201 22:59:02.903634 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2\": container with ID starting with 5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2 not found: ID does not exist" containerID="5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2" Dec 01 22:59:02 crc kubenswrapper[4962]: I1201 22:59:02.903655 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2"} err="failed to get container status \"5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2\": rpc error: code = NotFound desc = could not find container \"5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2\": container with ID starting with 5cfb6cd3f675f193c485a934951053a0c78089f2c57d8dc93dcff49e92728ba2 not found: ID does not exist" Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.242309 4962 kubelet_volumes.go:163] "Cleaned 
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.337447 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mht7f"]
Dec 01 22:59:04 crc kubenswrapper[4962]: E1201 22:59:04.338112 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="extract-content"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.338135 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="extract-content"
Dec 01 22:59:04 crc kubenswrapper[4962]: E1201 22:59:04.338200 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="registry-server"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.338209 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="registry-server"
Dec 01 22:59:04 crc kubenswrapper[4962]: E1201 22:59:04.338240 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="extract-utilities"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.338249 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="extract-utilities"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.338530 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adeab69-fee0-4a3c-a575-c5d03963663d" containerName="registry-server"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.340733 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.349778 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mht7f"]
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.456112 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-utilities\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.456198 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb2nm\" (UniqueName: \"kubernetes.io/projected/74982f98-3400-49b0-8629-ccc1313fa70e-kube-api-access-rb2nm\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.456538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-catalog-content\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.559421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-utilities\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.559479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb2nm\" (UniqueName: \"kubernetes.io/projected/74982f98-3400-49b0-8629-ccc1313fa70e-kube-api-access-rb2nm\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.559560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-catalog-content\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.559991 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-utilities\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.560008 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-catalog-content\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.582029 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb2nm\" (UniqueName: \"kubernetes.io/projected/74982f98-3400-49b0-8629-ccc1313fa70e-kube-api-access-rb2nm\") pod \"certified-operators-mht7f\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:04 crc kubenswrapper[4962]: I1201 22:59:04.687222 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mht7f"
Dec 01 22:59:05 crc kubenswrapper[4962]: I1201 22:59:05.228249 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mht7f"]
Dec 01 22:59:05 crc kubenswrapper[4962]: I1201 22:59:05.846141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mht7f" event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerStarted","Data":"92834785ade487845ff01bdffb05011b810afed5057c7462e6d5edd15491523e"}
Dec 01 22:59:06 crc kubenswrapper[4962]: I1201 22:59:06.864313 4962 generic.go:334] "Generic (PLEG): container finished" podID="74982f98-3400-49b0-8629-ccc1313fa70e" containerID="479677979bb6892f8c10554e07d02daccad9af7adc7d0708ce9c5b3536ed30f3" exitCode=0
Dec 01 22:59:06 crc kubenswrapper[4962]: I1201 22:59:06.864395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mht7f" event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerDied","Data":"479677979bb6892f8c10554e07d02daccad9af7adc7d0708ce9c5b3536ed30f3"}
Dec 01 22:59:06 crc kubenswrapper[4962]: I1201 22:59:06.935807 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cdxcw"]
Dec 01 22:59:06 crc kubenswrapper[4962]: I1201 22:59:06.939595 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:06 crc kubenswrapper[4962]: I1201 22:59:06.967547 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdxcw"]
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.128322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-utilities\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.129706 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k8zt\" (UniqueName: \"kubernetes.io/projected/bbf29d48-8dce-4db8-95e2-3a13585485d2-kube-api-access-9k8zt\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.129819 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-catalog-content\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.233000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k8zt\" (UniqueName: \"kubernetes.io/projected/bbf29d48-8dce-4db8-95e2-3a13585485d2-kube-api-access-9k8zt\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.233198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-catalog-content\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.234256 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-catalog-content\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.234855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-utilities\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.235357 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-utilities\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.271810 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k8zt\" (UniqueName: \"kubernetes.io/projected/bbf29d48-8dce-4db8-95e2-3a13585485d2-kube-api-access-9k8zt\") pod \"redhat-marketplace-cdxcw\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:07 crc kubenswrapper[4962]: I1201 22:59:07.571864 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdxcw"
Dec 01 22:59:08 crc kubenswrapper[4962]: I1201 22:59:08.085545 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdxcw"]
Dec 01 22:59:08 crc kubenswrapper[4962]: I1201 22:59:08.889237 4962 generic.go:334] "Generic (PLEG): container finished" podID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerID="7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12" exitCode=0
Dec 01 22:59:08 crc kubenswrapper[4962]: I1201 22:59:08.889303 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdxcw" event={"ID":"bbf29d48-8dce-4db8-95e2-3a13585485d2","Type":"ContainerDied","Data":"7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12"}
Dec 01 22:59:08 crc kubenswrapper[4962]: I1201 22:59:08.889636 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdxcw" event={"ID":"bbf29d48-8dce-4db8-95e2-3a13585485d2","Type":"ContainerStarted","Data":"44f086f729f5aa2363ea69f82914875a4da07ccca62c90e356e3c4364a026e49"}
Dec 01 22:59:08 crc kubenswrapper[4962]: I1201 22:59:08.892361 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mht7f" event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerStarted","Data":"2bb610549cc4a43b2a1edf6ccf086fa522da13d4d64185892b7324c9cc7dcb78"}
Dec 01 22:59:09 crc kubenswrapper[4962]: I1201 22:59:09.910314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdxcw" event={"ID":"bbf29d48-8dce-4db8-95e2-3a13585485d2","Type":"ContainerStarted","Data":"355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc"}
Dec 01 22:59:09 crc kubenswrapper[4962]: I1201 22:59:09.917340 4962 generic.go:334] "Generic (PLEG): container finished" podID="74982f98-3400-49b0-8629-ccc1313fa70e" containerID="2bb610549cc4a43b2a1edf6ccf086fa522da13d4d64185892b7324c9cc7dcb78" exitCode=0
Dec 01 22:59:09 crc kubenswrapper[4962]: I1201 22:59:09.917395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mht7f" event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerDied","Data":"2bb610549cc4a43b2a1edf6ccf086fa522da13d4d64185892b7324c9cc7dcb78"}
Dec 01 22:59:10 crc kubenswrapper[4962]: I1201 22:59:10.935642 4962 generic.go:334] "Generic (PLEG): container finished" podID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerID="355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc" exitCode=0
Dec 01 22:59:10 crc kubenswrapper[4962]: I1201 22:59:10.935698 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdxcw" event={"ID":"bbf29d48-8dce-4db8-95e2-3a13585485d2","Type":"ContainerDied","Data":"355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc"}
Dec 01 22:59:10 crc kubenswrapper[4962]: I1201 22:59:10.940694 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mht7f" event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerStarted","Data":"daa0adf49738e143fa8cc9294b98c509917dc1d8ef47764d295cb16bfb99ab65"}
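All four marketplace catalog pods in this log follow the same three-step lifecycle visible above: extract-utilities and extract-content each run to completion (ContainerDied with exitCode=0), then registry-server starts and stays up. A sketch for confirming that sequence from the API rather than from PLEG events, assuming the Python kubernetes client and that the extract steps are initContainers, as their run-to-completion pattern suggests (the pod name is one of the examples above and may have been deleted since):

from kubernetes import client, config

# Sketch: show exit codes for the extract steps of a catalog pod.
config.load_kube_config()
v1 = client.CoreV1Api()
pod = v1.read_namespaced_pod("certified-operators-mht7f", "openshift-marketplace")
for st in (pod.status.init_container_statuses or []):
    term = st.state.terminated
    print(st.name, term.exit_code if term else "still running")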
event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerStarted","Data":"daa0adf49738e143fa8cc9294b98c509917dc1d8ef47764d295cb16bfb99ab65"} Dec 01 22:59:10 crc kubenswrapper[4962]: I1201 22:59:10.999218 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mht7f" podStartSLOduration=3.4308197 podStartE2EDuration="6.999195956s" podCreationTimestamp="2025-12-01 22:59:04 +0000 UTC" firstStartedPulling="2025-12-01 22:59:06.867569611 +0000 UTC m=+5130.969008826" lastFinishedPulling="2025-12-01 22:59:10.435945887 +0000 UTC m=+5134.537385082" observedRunningTime="2025-12-01 22:59:10.979654142 +0000 UTC m=+5135.081093357" watchObservedRunningTime="2025-12-01 22:59:10.999195956 +0000 UTC m=+5135.100635161" Dec 01 22:59:12 crc kubenswrapper[4962]: I1201 22:59:12.970021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdxcw" event={"ID":"bbf29d48-8dce-4db8-95e2-3a13585485d2","Type":"ContainerStarted","Data":"3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8"} Dec 01 22:59:14 crc kubenswrapper[4962]: I1201 22:59:14.687644 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mht7f" Dec 01 22:59:14 crc kubenswrapper[4962]: I1201 22:59:14.687969 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mht7f" Dec 01 22:59:15 crc kubenswrapper[4962]: I1201 22:59:15.852871 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mht7f" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="registry-server" probeResult="failure" output=< Dec 01 22:59:15 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 22:59:15 crc kubenswrapper[4962]: > Dec 01 22:59:17 crc kubenswrapper[4962]: I1201 22:59:17.573220 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cdxcw" Dec 01 22:59:17 crc kubenswrapper[4962]: I1201 22:59:17.573597 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cdxcw" Dec 01 22:59:17 crc kubenswrapper[4962]: I1201 22:59:17.646081 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cdxcw" Dec 01 22:59:17 crc kubenswrapper[4962]: I1201 22:59:17.664426 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cdxcw" podStartSLOduration=8.833205964 podStartE2EDuration="11.664406791s" podCreationTimestamp="2025-12-01 22:59:06 +0000 UTC" firstStartedPulling="2025-12-01 22:59:08.892335484 +0000 UTC m=+5132.993774679" lastFinishedPulling="2025-12-01 22:59:11.723536311 +0000 UTC m=+5135.824975506" observedRunningTime="2025-12-01 22:59:12.996699756 +0000 UTC m=+5137.098138961" watchObservedRunningTime="2025-12-01 22:59:17.664406791 +0000 UTC m=+5141.765845986" Dec 01 22:59:18 crc kubenswrapper[4962]: I1201 22:59:18.108396 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cdxcw" Dec 01 22:59:18 crc kubenswrapper[4962]: I1201 22:59:18.160554 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdxcw"] Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.099066 4962 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cdxcw" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="registry-server" containerID="cri-o://3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8" gracePeriod=2 Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.814797 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdxcw" Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.900349 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-utilities\") pod \"bbf29d48-8dce-4db8-95e2-3a13585485d2\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.900393 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-catalog-content\") pod \"bbf29d48-8dce-4db8-95e2-3a13585485d2\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.900573 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k8zt\" (UniqueName: \"kubernetes.io/projected/bbf29d48-8dce-4db8-95e2-3a13585485d2-kube-api-access-9k8zt\") pod \"bbf29d48-8dce-4db8-95e2-3a13585485d2\" (UID: \"bbf29d48-8dce-4db8-95e2-3a13585485d2\") " Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.900822 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-utilities" (OuterVolumeSpecName: "utilities") pod "bbf29d48-8dce-4db8-95e2-3a13585485d2" (UID: "bbf29d48-8dce-4db8-95e2-3a13585485d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.901696 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.907385 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf29d48-8dce-4db8-95e2-3a13585485d2-kube-api-access-9k8zt" (OuterVolumeSpecName: "kube-api-access-9k8zt") pod "bbf29d48-8dce-4db8-95e2-3a13585485d2" (UID: "bbf29d48-8dce-4db8-95e2-3a13585485d2"). InnerVolumeSpecName "kube-api-access-9k8zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:59:20 crc kubenswrapper[4962]: I1201 22:59:20.917357 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbf29d48-8dce-4db8-95e2-3a13585485d2" (UID: "bbf29d48-8dce-4db8-95e2-3a13585485d2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.004523 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf29d48-8dce-4db8-95e2-3a13585485d2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.004561 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k8zt\" (UniqueName: \"kubernetes.io/projected/bbf29d48-8dce-4db8-95e2-3a13585485d2-kube-api-access-9k8zt\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.112946 4962 generic.go:334] "Generic (PLEG): container finished" podID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerID="3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8" exitCode=0 Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.113047 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdxcw" event={"ID":"bbf29d48-8dce-4db8-95e2-3a13585485d2","Type":"ContainerDied","Data":"3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8"} Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.113279 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdxcw" event={"ID":"bbf29d48-8dce-4db8-95e2-3a13585485d2","Type":"ContainerDied","Data":"44f086f729f5aa2363ea69f82914875a4da07ccca62c90e356e3c4364a026e49"} Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.113301 4962 scope.go:117] "RemoveContainer" containerID="3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.113076 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdxcw" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.146675 4962 scope.go:117] "RemoveContainer" containerID="355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.157981 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdxcw"] Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.167874 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdxcw"] Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.182409 4962 scope.go:117] "RemoveContainer" containerID="7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.238619 4962 scope.go:117] "RemoveContainer" containerID="3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8" Dec 01 22:59:21 crc kubenswrapper[4962]: E1201 22:59:21.239183 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8\": container with ID starting with 3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8 not found: ID does not exist" containerID="3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.239226 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8"} err="failed to get container status \"3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8\": rpc error: code = NotFound desc = could not find container \"3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8\": container with ID starting with 3b38d9ed5d697a346b30e17de83265c10f7a1a61554fad7a6d2f4fbc8fa778b8 not found: ID does not exist" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.239250 4962 scope.go:117] "RemoveContainer" containerID="355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc" Dec 01 22:59:21 crc kubenswrapper[4962]: E1201 22:59:21.239706 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc\": container with ID starting with 355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc not found: ID does not exist" containerID="355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.239737 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc"} err="failed to get container status \"355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc\": rpc error: code = NotFound desc = could not find container \"355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc\": container with ID starting with 355407b2fe0f619edf61fecb7b6b175f8757f5522cf356e1520c9554058a6cfc not found: ID does not exist" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.239766 4962 scope.go:117] "RemoveContainer" containerID="7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12" Dec 01 22:59:21 crc kubenswrapper[4962]: E1201 22:59:21.240476 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12\": container with ID starting with 7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12 not found: ID does not exist" containerID="7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12" Dec 01 22:59:21 crc kubenswrapper[4962]: I1201 22:59:21.240503 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12"} err="failed to get container status \"7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12\": rpc error: code = NotFound desc = could not find container \"7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12\": container with ID starting with 7b2795be1527f4ef38ee702b269a7ded8bd084ba6fdd067bcdbc66482e665f12 not found: ID does not exist" Dec 01 22:59:22 crc kubenswrapper[4962]: I1201 22:59:22.237123 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" path="/var/lib/kubelet/pods/bbf29d48-8dce-4db8-95e2-3a13585485d2/volumes" Dec 01 22:59:24 crc kubenswrapper[4962]: I1201 22:59:24.737606 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mht7f" Dec 01 22:59:24 crc kubenswrapper[4962]: I1201 22:59:24.801344 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mht7f" Dec 01 22:59:25 crc kubenswrapper[4962]: I1201 22:59:25.291902 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mht7f"] Dec 01 22:59:26 crc kubenswrapper[4962]: I1201 22:59:26.183974 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mht7f" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="registry-server" containerID="cri-o://daa0adf49738e143fa8cc9294b98c509917dc1d8ef47764d295cb16bfb99ab65" gracePeriod=2 Dec 01 22:59:27 crc kubenswrapper[4962]: I1201 22:59:27.200254 4962 generic.go:334] "Generic (PLEG): container finished" podID="74982f98-3400-49b0-8629-ccc1313fa70e" containerID="daa0adf49738e143fa8cc9294b98c509917dc1d8ef47764d295cb16bfb99ab65" exitCode=0 Dec 01 22:59:27 crc kubenswrapper[4962]: I1201 22:59:27.200359 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mht7f" event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerDied","Data":"daa0adf49738e143fa8cc9294b98c509917dc1d8ef47764d295cb16bfb99ab65"} Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.084736 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mht7f" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.142755 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-utilities\") pod \"74982f98-3400-49b0-8629-ccc1313fa70e\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.142947 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-catalog-content\") pod \"74982f98-3400-49b0-8629-ccc1313fa70e\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.143003 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb2nm\" (UniqueName: \"kubernetes.io/projected/74982f98-3400-49b0-8629-ccc1313fa70e-kube-api-access-rb2nm\") pod \"74982f98-3400-49b0-8629-ccc1313fa70e\" (UID: \"74982f98-3400-49b0-8629-ccc1313fa70e\") " Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.143626 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-utilities" (OuterVolumeSpecName: "utilities") pod "74982f98-3400-49b0-8629-ccc1313fa70e" (UID: "74982f98-3400-49b0-8629-ccc1313fa70e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.154312 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74982f98-3400-49b0-8629-ccc1313fa70e-kube-api-access-rb2nm" (OuterVolumeSpecName: "kube-api-access-rb2nm") pod "74982f98-3400-49b0-8629-ccc1313fa70e" (UID: "74982f98-3400-49b0-8629-ccc1313fa70e"). InnerVolumeSpecName "kube-api-access-rb2nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.190840 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74982f98-3400-49b0-8629-ccc1313fa70e" (UID: "74982f98-3400-49b0-8629-ccc1313fa70e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.217070 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mht7f" event={"ID":"74982f98-3400-49b0-8629-ccc1313fa70e","Type":"ContainerDied","Data":"92834785ade487845ff01bdffb05011b810afed5057c7462e6d5edd15491523e"} Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.217131 4962 scope.go:117] "RemoveContainer" containerID="daa0adf49738e143fa8cc9294b98c509917dc1d8ef47764d295cb16bfb99ab65" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.217304 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mht7f" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.250079 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb2nm\" (UniqueName: \"kubernetes.io/projected/74982f98-3400-49b0-8629-ccc1313fa70e-kube-api-access-rb2nm\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.250466 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.250480 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74982f98-3400-49b0-8629-ccc1313fa70e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.254313 4962 scope.go:117] "RemoveContainer" containerID="2bb610549cc4a43b2a1edf6ccf086fa522da13d4d64185892b7324c9cc7dcb78" Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.267628 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mht7f"] Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.278027 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mht7f"] Dec 01 22:59:28 crc kubenswrapper[4962]: I1201 22:59:28.280541 4962 scope.go:117] "RemoveContainer" containerID="479677979bb6892f8c10554e07d02daccad9af7adc7d0708ce9c5b3536ed30f3" Dec 01 22:59:30 crc kubenswrapper[4962]: I1201 22:59:30.236270 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" path="/var/lib/kubelet/pods/74982f98-3400-49b0-8629-ccc1313fa70e/volumes" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.243299 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd"] Dec 01 23:00:00 crc kubenswrapper[4962]: E1201 23:00:00.244292 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="registry-server" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244370 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="registry-server" Dec 01 23:00:00 crc kubenswrapper[4962]: E1201 23:00:00.244401 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="extract-content" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244408 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="extract-content" Dec 01 23:00:00 crc kubenswrapper[4962]: E1201 23:00:00.244420 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="extract-utilities" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244426 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="extract-utilities" Dec 01 23:00:00 crc kubenswrapper[4962]: E1201 23:00:00.244441 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="registry-server" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244446 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="registry-server" Dec 01 23:00:00 crc kubenswrapper[4962]: E1201 23:00:00.244471 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="extract-content" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244476 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="extract-content" Dec 01 23:00:00 crc kubenswrapper[4962]: E1201 23:00:00.244484 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="extract-utilities" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244490 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="extract-utilities" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244681 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf29d48-8dce-4db8-95e2-3a13585485d2" containerName="registry-server" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.244690 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="74982f98-3400-49b0-8629-ccc1313fa70e" containerName="registry-server" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.246031 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.255588 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.256106 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.361655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd"] Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.417801 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3855da1a-5c0c-47f3-a434-6812d8decdcd-secret-volume\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.417914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3855da1a-5c0c-47f3-a434-6812d8decdcd-config-volume\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.418154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkfk\" (UniqueName: \"kubernetes.io/projected/3855da1a-5c0c-47f3-a434-6812d8decdcd-kube-api-access-sxkfk\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.520254 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-sxkfk\" (UniqueName: \"kubernetes.io/projected/3855da1a-5c0c-47f3-a434-6812d8decdcd-kube-api-access-sxkfk\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.520430 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3855da1a-5c0c-47f3-a434-6812d8decdcd-secret-volume\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.520481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3855da1a-5c0c-47f3-a434-6812d8decdcd-config-volume\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.521360 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3855da1a-5c0c-47f3-a434-6812d8decdcd-config-volume\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.537340 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3855da1a-5c0c-47f3-a434-6812d8decdcd-secret-volume\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.537953 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkfk\" (UniqueName: \"kubernetes.io/projected/3855da1a-5c0c-47f3-a434-6812d8decdcd-kube-api-access-sxkfk\") pod \"collect-profiles-29410500-b6dkd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:00 crc kubenswrapper[4962]: I1201 23:00:00.570311 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:01 crc kubenswrapper[4962]: I1201 23:00:01.091293 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd"] Dec 01 23:00:01 crc kubenswrapper[4962]: I1201 23:00:01.609282 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" event={"ID":"3855da1a-5c0c-47f3-a434-6812d8decdcd","Type":"ContainerStarted","Data":"e55698e6563568e76dd60ccfe8491b201d225750f2f13a2238e92b035314f1ad"} Dec 01 23:00:01 crc kubenswrapper[4962]: I1201 23:00:01.609557 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" event={"ID":"3855da1a-5c0c-47f3-a434-6812d8decdcd","Type":"ContainerStarted","Data":"7bb2f3336a5379d065f637dd47ce60375a25492987f1be7079644f33dc67e2e7"} Dec 01 23:00:01 crc kubenswrapper[4962]: I1201 23:00:01.644920 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" podStartSLOduration=1.6448934259999999 podStartE2EDuration="1.644893426s" podCreationTimestamp="2025-12-01 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 23:00:01.623655063 +0000 UTC m=+5185.725094298" watchObservedRunningTime="2025-12-01 23:00:01.644893426 +0000 UTC m=+5185.746332641" Dec 01 23:00:02 crc kubenswrapper[4962]: I1201 23:00:02.626148 4962 generic.go:334] "Generic (PLEG): container finished" podID="3855da1a-5c0c-47f3-a434-6812d8decdcd" containerID="e55698e6563568e76dd60ccfe8491b201d225750f2f13a2238e92b035314f1ad" exitCode=0 Dec 01 23:00:02 crc kubenswrapper[4962]: I1201 23:00:02.626481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" event={"ID":"3855da1a-5c0c-47f3-a434-6812d8decdcd","Type":"ContainerDied","Data":"e55698e6563568e76dd60ccfe8491b201d225750f2f13a2238e92b035314f1ad"} Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.197057 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.335995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxkfk\" (UniqueName: \"kubernetes.io/projected/3855da1a-5c0c-47f3-a434-6812d8decdcd-kube-api-access-sxkfk\") pod \"3855da1a-5c0c-47f3-a434-6812d8decdcd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.336369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3855da1a-5c0c-47f3-a434-6812d8decdcd-secret-volume\") pod \"3855da1a-5c0c-47f3-a434-6812d8decdcd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.336441 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3855da1a-5c0c-47f3-a434-6812d8decdcd-config-volume\") pod \"3855da1a-5c0c-47f3-a434-6812d8decdcd\" (UID: \"3855da1a-5c0c-47f3-a434-6812d8decdcd\") " Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.337064 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3855da1a-5c0c-47f3-a434-6812d8decdcd-config-volume" (OuterVolumeSpecName: "config-volume") pod "3855da1a-5c0c-47f3-a434-6812d8decdcd" (UID: "3855da1a-5c0c-47f3-a434-6812d8decdcd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.342876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3855da1a-5c0c-47f3-a434-6812d8decdcd-kube-api-access-sxkfk" (OuterVolumeSpecName: "kube-api-access-sxkfk") pod "3855da1a-5c0c-47f3-a434-6812d8decdcd" (UID: "3855da1a-5c0c-47f3-a434-6812d8decdcd"). InnerVolumeSpecName "kube-api-access-sxkfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.344110 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3855da1a-5c0c-47f3-a434-6812d8decdcd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3855da1a-5c0c-47f3-a434-6812d8decdcd" (UID: "3855da1a-5c0c-47f3-a434-6812d8decdcd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.438783 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxkfk\" (UniqueName: \"kubernetes.io/projected/3855da1a-5c0c-47f3-a434-6812d8decdcd-kube-api-access-sxkfk\") on node \"crc\" DevicePath \"\"" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.438810 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3855da1a-5c0c-47f3-a434-6812d8decdcd-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.438820 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3855da1a-5c0c-47f3-a434-6812d8decdcd-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.647015 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" event={"ID":"3855da1a-5c0c-47f3-a434-6812d8decdcd","Type":"ContainerDied","Data":"7bb2f3336a5379d065f637dd47ce60375a25492987f1be7079644f33dc67e2e7"} Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.647056 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bb2f3336a5379d065f637dd47ce60375a25492987f1be7079644f33dc67e2e7" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.647082 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410500-b6dkd" Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.729837 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"] Dec 01 23:00:04 crc kubenswrapper[4962]: I1201 23:00:04.739031 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410455-m4kkr"] Dec 01 23:00:06 crc kubenswrapper[4962]: I1201 23:00:06.235012 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c084a91-ba28-43e1-b1d7-bb0c15be6c97" path="/var/lib/kubelet/pods/5c084a91-ba28-43e1-b1d7-bb0c15be6c97/volumes" Dec 01 23:00:08 crc kubenswrapper[4962]: I1201 23:00:08.757911 4962 scope.go:117] "RemoveContainer" containerID="03826c7bbfe40b786382eebb831fd7e55698210c7d2284243d7e483cd28931cb" Dec 01 23:00:32 crc kubenswrapper[4962]: I1201 23:00:32.785302 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:00:32 crc kubenswrapper[4962]: I1201 23:00:32.786013 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.186787 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29410501-d9brm"] Dec 01 23:01:00 crc kubenswrapper[4962]: E1201 23:01:00.188285 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3855da1a-5c0c-47f3-a434-6812d8decdcd" 
containerName="collect-profiles" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.188309 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3855da1a-5c0c-47f3-a434-6812d8decdcd" containerName="collect-profiles" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.188794 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3855da1a-5c0c-47f3-a434-6812d8decdcd" containerName="collect-profiles" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.190620 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.197661 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410501-d9brm"] Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.313284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-combined-ca-bundle\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.313399 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-fernet-keys\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.313432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsd7\" (UniqueName: \"kubernetes.io/projected/3395cab0-9781-4fec-8e37-a3c4be3aca9a-kube-api-access-sxsd7\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.313489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.416092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-combined-ca-bundle\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.416208 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-fernet-keys\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.416251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsd7\" (UniqueName: \"kubernetes.io/projected/3395cab0-9781-4fec-8e37-a3c4be3aca9a-kube-api-access-sxsd7\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " 
pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.416323 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.590539 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-fernet-keys\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.590919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.592384 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-combined-ca-bundle\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.593570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxsd7\" (UniqueName: \"kubernetes.io/projected/3395cab0-9781-4fec-8e37-a3c4be3aca9a-kube-api-access-sxsd7\") pod \"keystone-cron-29410501-d9brm\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:00 crc kubenswrapper[4962]: I1201 23:01:00.833412 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:01 crc kubenswrapper[4962]: I1201 23:01:01.398234 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410501-d9brm"] Dec 01 23:01:02 crc kubenswrapper[4962]: I1201 23:01:02.380066 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410501-d9brm" event={"ID":"3395cab0-9781-4fec-8e37-a3c4be3aca9a","Type":"ContainerStarted","Data":"d8b0bbd62cfd8157637f7b7b2a18bd1f80bdbc0c09cf164fa6053fe9d4321c0b"} Dec 01 23:01:02 crc kubenswrapper[4962]: I1201 23:01:02.380497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410501-d9brm" event={"ID":"3395cab0-9781-4fec-8e37-a3c4be3aca9a","Type":"ContainerStarted","Data":"727fd69f7fad737fae44efe50aae409ad0dd95fcefc8cc958401075ad32c0ddf"} Dec 01 23:01:02 crc kubenswrapper[4962]: I1201 23:01:02.784151 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:01:02 crc kubenswrapper[4962]: I1201 23:01:02.785300 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:01:04 crc kubenswrapper[4962]: I1201 23:01:04.405173 4962 generic.go:334] "Generic (PLEG): container finished" podID="3395cab0-9781-4fec-8e37-a3c4be3aca9a" containerID="d8b0bbd62cfd8157637f7b7b2a18bd1f80bdbc0c09cf164fa6053fe9d4321c0b" exitCode=0 Dec 01 23:01:04 crc kubenswrapper[4962]: I1201 23:01:04.405297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410501-d9brm" event={"ID":"3395cab0-9781-4fec-8e37-a3c4be3aca9a","Type":"ContainerDied","Data":"d8b0bbd62cfd8157637f7b7b2a18bd1f80bdbc0c09cf164fa6053fe9d4321c0b"} Dec 01 23:01:05 crc kubenswrapper[4962]: I1201 23:01:05.885187 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.074052 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-combined-ca-bundle\") pod \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.074179 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data\") pod \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.074285 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-fernet-keys\") pod \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.074328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxsd7\" (UniqueName: \"kubernetes.io/projected/3395cab0-9781-4fec-8e37-a3c4be3aca9a-kube-api-access-sxsd7\") pod \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.082461 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3395cab0-9781-4fec-8e37-a3c4be3aca9a" (UID: "3395cab0-9781-4fec-8e37-a3c4be3aca9a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.085298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3395cab0-9781-4fec-8e37-a3c4be3aca9a-kube-api-access-sxsd7" (OuterVolumeSpecName: "kube-api-access-sxsd7") pod "3395cab0-9781-4fec-8e37-a3c4be3aca9a" (UID: "3395cab0-9781-4fec-8e37-a3c4be3aca9a"). InnerVolumeSpecName "kube-api-access-sxsd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.123599 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3395cab0-9781-4fec-8e37-a3c4be3aca9a" (UID: "3395cab0-9781-4fec-8e37-a3c4be3aca9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.175240 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data" (OuterVolumeSpecName: "config-data") pod "3395cab0-9781-4fec-8e37-a3c4be3aca9a" (UID: "3395cab0-9781-4fec-8e37-a3c4be3aca9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.176154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data\") pod \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\" (UID: \"3395cab0-9781-4fec-8e37-a3c4be3aca9a\") " Dec 01 23:01:06 crc kubenswrapper[4962]: W1201 23:01:06.176407 4962 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3395cab0-9781-4fec-8e37-a3c4be3aca9a/volumes/kubernetes.io~secret/config-data Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.176439 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data" (OuterVolumeSpecName: "config-data") pod "3395cab0-9781-4fec-8e37-a3c4be3aca9a" (UID: "3395cab0-9781-4fec-8e37-a3c4be3aca9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.177534 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.177565 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsd7\" (UniqueName: \"kubernetes.io/projected/3395cab0-9781-4fec-8e37-a3c4be3aca9a-kube-api-access-sxsd7\") on node \"crc\" DevicePath \"\"" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.177580 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.177591 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395cab0-9781-4fec-8e37-a3c4be3aca9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.434584 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410501-d9brm" event={"ID":"3395cab0-9781-4fec-8e37-a3c4be3aca9a","Type":"ContainerDied","Data":"727fd69f7fad737fae44efe50aae409ad0dd95fcefc8cc958401075ad32c0ddf"} Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.434685 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="727fd69f7fad737fae44efe50aae409ad0dd95fcefc8cc958401075ad32c0ddf" Dec 01 23:01:06 crc kubenswrapper[4962]: I1201 23:01:06.434730 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410501-d9brm" Dec 01 23:01:32 crc kubenswrapper[4962]: I1201 23:01:32.784974 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:01:32 crc kubenswrapper[4962]: I1201 23:01:32.785538 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:01:32 crc kubenswrapper[4962]: I1201 23:01:32.785580 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 23:01:32 crc kubenswrapper[4962]: I1201 23:01:32.799618 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 23:01:32 crc kubenswrapper[4962]: I1201 23:01:32.799718 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" gracePeriod=600 Dec 01 23:01:32 crc kubenswrapper[4962]: E1201 23:01:32.922402 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:01:33 crc kubenswrapper[4962]: I1201 23:01:33.815107 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" exitCode=0 Dec 01 23:01:33 crc kubenswrapper[4962]: I1201 23:01:33.815157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa"} Dec 01 23:01:33 crc kubenswrapper[4962]: I1201 23:01:33.815191 4962 scope.go:117] "RemoveContainer" containerID="3ad1e56f37f13a30e1370bba29af150bae21baabff59d52513f0ad149646a414" Dec 01 23:01:33 crc kubenswrapper[4962]: I1201 23:01:33.816031 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:01:33 crc kubenswrapper[4962]: E1201 23:01:33.816438 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:01:48 crc kubenswrapper[4962]: I1201 23:01:48.222237 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:01:48 crc kubenswrapper[4962]: E1201 23:01:48.223429 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:02:02 crc kubenswrapper[4962]: I1201 23:02:02.219869 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:02:02 crc kubenswrapper[4962]: E1201 23:02:02.220679 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:02:17 crc kubenswrapper[4962]: I1201 23:02:17.221010 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:02:17 crc kubenswrapper[4962]: E1201 23:02:17.222453 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:02:28 crc kubenswrapper[4962]: I1201 23:02:28.221275 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:02:28 crc kubenswrapper[4962]: E1201 23:02:28.222666 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:02:40 crc kubenswrapper[4962]: I1201 23:02:40.220042 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:02:40 crc kubenswrapper[4962]: E1201 23:02:40.221069 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:02:52 crc kubenswrapper[4962]: I1201 23:02:52.220676 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:02:52 crc kubenswrapper[4962]: E1201 23:02:52.222212 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:03:03 crc kubenswrapper[4962]: I1201 23:03:03.226915 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:03:03 crc kubenswrapper[4962]: E1201 23:03:03.227804 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:03:17 crc kubenswrapper[4962]: I1201 23:03:17.220093 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:03:17 crc kubenswrapper[4962]: E1201 23:03:17.222715 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:03:29 crc kubenswrapper[4962]: I1201 23:03:29.219860 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:03:29 crc kubenswrapper[4962]: E1201 23:03:29.221042 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:03:44 crc kubenswrapper[4962]: I1201 23:03:44.221295 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:03:44 crc kubenswrapper[4962]: E1201 23:03:44.223295 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:03:59 crc kubenswrapper[4962]: I1201 23:03:59.221206 4962 
scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:03:59 crc kubenswrapper[4962]: E1201 23:03:59.221967 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:04:13 crc kubenswrapper[4962]: I1201 23:04:13.219636 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:04:13 crc kubenswrapper[4962]: E1201 23:04:13.220772 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:04:26 crc kubenswrapper[4962]: I1201 23:04:26.249666 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:04:26 crc kubenswrapper[4962]: E1201 23:04:26.252484 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:04:39 crc kubenswrapper[4962]: I1201 23:04:39.220438 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:04:39 crc kubenswrapper[4962]: E1201 23:04:39.221499 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:04:52 crc kubenswrapper[4962]: I1201 23:04:52.220197 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:04:52 crc kubenswrapper[4962]: E1201 23:04:52.221063 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:05:06 crc kubenswrapper[4962]: I1201 23:05:06.231111 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:05:06 crc kubenswrapper[4962]: E1201 23:05:06.232035 4962 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:05:18 crc kubenswrapper[4962]: I1201 23:05:18.224107 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:05:18 crc kubenswrapper[4962]: E1201 23:05:18.253748 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:05:31 crc kubenswrapper[4962]: I1201 23:05:31.220071 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:05:31 crc kubenswrapper[4962]: E1201 23:05:31.221232 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:05:45 crc kubenswrapper[4962]: I1201 23:05:45.220492 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:05:45 crc kubenswrapper[4962]: E1201 23:05:45.221747 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:05:59 crc kubenswrapper[4962]: I1201 23:05:59.220371 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:05:59 crc kubenswrapper[4962]: E1201 23:05:59.221454 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:06:11 crc kubenswrapper[4962]: I1201 23:06:11.220913 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:06:11 crc kubenswrapper[4962]: E1201 23:06:11.222059 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:06:23 crc kubenswrapper[4962]: I1201 23:06:23.220382 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:06:23 crc kubenswrapper[4962]: E1201 23:06:23.221662 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:06:35 crc kubenswrapper[4962]: I1201 23:06:35.220303 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:06:35 crc kubenswrapper[4962]: I1201 23:06:35.884048 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"4f0dd1a4061cd9271344a31e8d89eab4bdd66bf2f565beae87f3ca6dd0c8508b"} Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.738415 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6jz7w"] Dec 01 23:07:58 crc kubenswrapper[4962]: E1201 23:07:58.741831 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3395cab0-9781-4fec-8e37-a3c4be3aca9a" containerName="keystone-cron" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.741868 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3395cab0-9781-4fec-8e37-a3c4be3aca9a" containerName="keystone-cron" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.742285 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3395cab0-9781-4fec-8e37-a3c4be3aca9a" containerName="keystone-cron" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.749351 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.775577 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6jz7w"] Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.848167 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-catalog-content\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.848388 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/7744bf46-b550-4785-b654-ebbfb99153f3-kube-api-access-dwzwd\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.848485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-utilities\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.950762 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/7744bf46-b550-4785-b654-ebbfb99153f3-kube-api-access-dwzwd\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.950874 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-utilities\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.950976 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-catalog-content\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.951582 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-catalog-content\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.952158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-utilities\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:58 crc kubenswrapper[4962]: I1201 23:07:58.973732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/7744bf46-b550-4785-b654-ebbfb99153f3-kube-api-access-dwzwd\") pod \"redhat-operators-6jz7w\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:59 crc kubenswrapper[4962]: I1201 23:07:59.086863 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:07:59 crc kubenswrapper[4962]: I1201 23:07:59.693383 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6jz7w"] Dec 01 23:08:00 crc kubenswrapper[4962]: I1201 23:08:00.150002 4962 generic.go:334] "Generic (PLEG): container finished" podID="7744bf46-b550-4785-b654-ebbfb99153f3" containerID="096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb" exitCode=0 Dec 01 23:08:00 crc kubenswrapper[4962]: I1201 23:08:00.150058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jz7w" event={"ID":"7744bf46-b550-4785-b654-ebbfb99153f3","Type":"ContainerDied","Data":"096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb"} Dec 01 23:08:00 crc kubenswrapper[4962]: I1201 23:08:00.150336 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jz7w" event={"ID":"7744bf46-b550-4785-b654-ebbfb99153f3","Type":"ContainerStarted","Data":"aca3f868eabf0b510e51fb54f09d7c2a9b30dae673162754f12a9ed86cd85a33"} Dec 01 23:08:00 crc kubenswrapper[4962]: I1201 23:08:00.152538 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 23:08:02 crc kubenswrapper[4962]: I1201 23:08:02.179041 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jz7w" event={"ID":"7744bf46-b550-4785-b654-ebbfb99153f3","Type":"ContainerStarted","Data":"7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169"} Dec 01 23:08:05 crc kubenswrapper[4962]: I1201 23:08:05.227566 4962 generic.go:334] "Generic (PLEG): container finished" podID="7744bf46-b550-4785-b654-ebbfb99153f3" containerID="7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169" exitCode=0 Dec 01 23:08:05 crc kubenswrapper[4962]: I1201 23:08:05.227714 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jz7w" event={"ID":"7744bf46-b550-4785-b654-ebbfb99153f3","Type":"ContainerDied","Data":"7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169"} Dec 01 23:08:06 crc kubenswrapper[4962]: I1201 23:08:06.241648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jz7w" event={"ID":"7744bf46-b550-4785-b654-ebbfb99153f3","Type":"ContainerStarted","Data":"e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc"} Dec 01 23:08:06 crc kubenswrapper[4962]: I1201 23:08:06.326988 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6jz7w" podStartSLOduration=2.836186101 podStartE2EDuration="8.326968127s" podCreationTimestamp="2025-12-01 23:07:58 +0000 UTC" firstStartedPulling="2025-12-01 23:08:00.152272845 +0000 UTC m=+5664.253712050" lastFinishedPulling="2025-12-01 23:08:05.643054881 +0000 UTC m=+5669.744494076" observedRunningTime="2025-12-01 23:08:06.316436678 +0000 UTC m=+5670.417875873" watchObservedRunningTime="2025-12-01 23:08:06.326968127 +0000 UTC m=+5670.428407322" Dec 01 23:08:09 crc 
kubenswrapper[4962]: I1201 23:08:09.087709 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:08:09 crc kubenswrapper[4962]: I1201 23:08:09.088350 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:08:10 crc kubenswrapper[4962]: I1201 23:08:10.157228 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6jz7w" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="registry-server" probeResult="failure" output=< Dec 01 23:08:10 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 23:08:10 crc kubenswrapper[4962]: > Dec 01 23:08:20 crc kubenswrapper[4962]: I1201 23:08:20.532048 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6jz7w" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="registry-server" probeResult="failure" output=< Dec 01 23:08:20 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 23:08:20 crc kubenswrapper[4962]: > Dec 01 23:08:29 crc kubenswrapper[4962]: I1201 23:08:29.162708 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:08:29 crc kubenswrapper[4962]: I1201 23:08:29.240011 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:08:29 crc kubenswrapper[4962]: I1201 23:08:29.924818 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6jz7w"] Dec 01 23:08:30 crc kubenswrapper[4962]: I1201 23:08:30.572536 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6jz7w" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="registry-server" containerID="cri-o://e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc" gracePeriod=2 Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.304380 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.316467 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-catalog-content\") pod \"7744bf46-b550-4785-b654-ebbfb99153f3\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.316550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-utilities\") pod \"7744bf46-b550-4785-b654-ebbfb99153f3\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.316669 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/7744bf46-b550-4785-b654-ebbfb99153f3-kube-api-access-dwzwd\") pod \"7744bf46-b550-4785-b654-ebbfb99153f3\" (UID: \"7744bf46-b550-4785-b654-ebbfb99153f3\") " Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.317655 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-utilities" (OuterVolumeSpecName: "utilities") pod "7744bf46-b550-4785-b654-ebbfb99153f3" (UID: "7744bf46-b550-4785-b654-ebbfb99153f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.327161 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7744bf46-b550-4785-b654-ebbfb99153f3-kube-api-access-dwzwd" (OuterVolumeSpecName: "kube-api-access-dwzwd") pod "7744bf46-b550-4785-b654-ebbfb99153f3" (UID: "7744bf46-b550-4785-b654-ebbfb99153f3"). InnerVolumeSpecName "kube-api-access-dwzwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.418803 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.418836 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwzwd\" (UniqueName: \"kubernetes.io/projected/7744bf46-b550-4785-b654-ebbfb99153f3-kube-api-access-dwzwd\") on node \"crc\" DevicePath \"\"" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.440201 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7744bf46-b550-4785-b654-ebbfb99153f3" (UID: "7744bf46-b550-4785-b654-ebbfb99153f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.521794 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7744bf46-b550-4785-b654-ebbfb99153f3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.590660 4962 generic.go:334] "Generic (PLEG): container finished" podID="7744bf46-b550-4785-b654-ebbfb99153f3" containerID="e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc" exitCode=0 Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.590710 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jz7w" event={"ID":"7744bf46-b550-4785-b654-ebbfb99153f3","Type":"ContainerDied","Data":"e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc"} Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.590742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6jz7w" event={"ID":"7744bf46-b550-4785-b654-ebbfb99153f3","Type":"ContainerDied","Data":"aca3f868eabf0b510e51fb54f09d7c2a9b30dae673162754f12a9ed86cd85a33"} Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.590762 4962 scope.go:117] "RemoveContainer" containerID="e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.590923 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6jz7w" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.638214 4962 scope.go:117] "RemoveContainer" containerID="7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169" Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.653171 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6jz7w"] Dec 01 23:08:31 crc kubenswrapper[4962]: I1201 23:08:31.667333 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6jz7w"] Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.219200 4962 scope.go:117] "RemoveContainer" containerID="096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb" Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.244475 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" path="/var/lib/kubelet/pods/7744bf46-b550-4785-b654-ebbfb99153f3/volumes" Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.304541 4962 scope.go:117] "RemoveContainer" containerID="e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc" Dec 01 23:08:32 crc kubenswrapper[4962]: E1201 23:08:32.305024 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc\": container with ID starting with e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc not found: ID does not exist" containerID="e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc" Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.305066 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc"} err="failed to get container status \"e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc\": rpc error: code = NotFound desc 
= could not find container \"e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc\": container with ID starting with e8162cbce3bfc3ba384dea84f444066d67c839bd6e28af08ccb16422f9cc21cc not found: ID does not exist" Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.305092 4962 scope.go:117] "RemoveContainer" containerID="7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169" Dec 01 23:08:32 crc kubenswrapper[4962]: E1201 23:08:32.305389 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169\": container with ID starting with 7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169 not found: ID does not exist" containerID="7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169" Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.305422 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169"} err="failed to get container status \"7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169\": rpc error: code = NotFound desc = could not find container \"7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169\": container with ID starting with 7e52013f4f9f4b2e432765258467777d40c4d3b0e0820cf57223b94d57d1b169 not found: ID does not exist" Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.305439 4962 scope.go:117] "RemoveContainer" containerID="096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb" Dec 01 23:08:32 crc kubenswrapper[4962]: E1201 23:08:32.305870 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb\": container with ID starting with 096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb not found: ID does not exist" containerID="096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb" Dec 01 23:08:32 crc kubenswrapper[4962]: I1201 23:08:32.305931 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb"} err="failed to get container status \"096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb\": rpc error: code = NotFound desc = could not find container \"096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb\": container with ID starting with 096c06d4f39b8504b8687f182efe441fdbd3bcf0762b02cf6a9033822e4091bb not found: ID does not exist" Dec 01 23:09:02 crc kubenswrapper[4962]: I1201 23:09:02.784691 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:09:02 crc kubenswrapper[4962]: I1201 23:09:02.785197 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:09:22 crc kubenswrapper[4962]: I1201 23:09:22.843315 4962 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-7dpmw"] Dec 01 23:09:22 crc kubenswrapper[4962]: E1201 23:09:22.844594 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="registry-server" Dec 01 23:09:22 crc kubenswrapper[4962]: I1201 23:09:22.844614 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="registry-server" Dec 01 23:09:22 crc kubenswrapper[4962]: E1201 23:09:22.844655 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="extract-content" Dec 01 23:09:22 crc kubenswrapper[4962]: I1201 23:09:22.844663 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="extract-content" Dec 01 23:09:22 crc kubenswrapper[4962]: E1201 23:09:22.844678 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="extract-utilities" Dec 01 23:09:22 crc kubenswrapper[4962]: I1201 23:09:22.844686 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="extract-utilities" Dec 01 23:09:22 crc kubenswrapper[4962]: I1201 23:09:22.844994 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7744bf46-b550-4785-b654-ebbfb99153f3" containerName="registry-server" Dec 01 23:09:22 crc kubenswrapper[4962]: I1201 23:09:22.847687 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:22 crc kubenswrapper[4962]: I1201 23:09:22.872570 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7dpmw"] Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.027210 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-catalog-content\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.027754 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-utilities\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.028073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbmk\" (UniqueName: \"kubernetes.io/projected/2f55786d-e472-4a9d-b23d-edaf5a52f306-kube-api-access-hpbmk\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.130961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbmk\" (UniqueName: \"kubernetes.io/projected/2f55786d-e472-4a9d-b23d-edaf5a52f306-kube-api-access-hpbmk\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.131105 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-catalog-content\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.131219 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-utilities\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.131879 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-utilities\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.132219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-catalog-content\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.154513 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbmk\" (UniqueName: \"kubernetes.io/projected/2f55786d-e472-4a9d-b23d-edaf5a52f306-kube-api-access-hpbmk\") pod \"community-operators-7dpmw\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.170742 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:23 crc kubenswrapper[4962]: I1201 23:09:23.773653 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7dpmw"] Dec 01 23:09:23 crc kubenswrapper[4962]: W1201 23:09:23.777111 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f55786d_e472_4a9d_b23d_edaf5a52f306.slice/crio-e6cad3435f850e05dcde0d84b1eb04baa252eee961aa72fa1d8233215d97bae2 WatchSource:0}: Error finding container e6cad3435f850e05dcde0d84b1eb04baa252eee961aa72fa1d8233215d97bae2: Status 404 returned error can't find the container with id e6cad3435f850e05dcde0d84b1eb04baa252eee961aa72fa1d8233215d97bae2 Dec 01 23:09:24 crc kubenswrapper[4962]: I1201 23:09:24.266998 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerID="f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0" exitCode=0 Dec 01 23:09:24 crc kubenswrapper[4962]: I1201 23:09:24.267065 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dpmw" event={"ID":"2f55786d-e472-4a9d-b23d-edaf5a52f306","Type":"ContainerDied","Data":"f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0"} Dec 01 23:09:24 crc kubenswrapper[4962]: I1201 23:09:24.267392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dpmw" event={"ID":"2f55786d-e472-4a9d-b23d-edaf5a52f306","Type":"ContainerStarted","Data":"e6cad3435f850e05dcde0d84b1eb04baa252eee961aa72fa1d8233215d97bae2"} Dec 01 23:09:26 crc kubenswrapper[4962]: I1201 23:09:26.299265 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dpmw" event={"ID":"2f55786d-e472-4a9d-b23d-edaf5a52f306","Type":"ContainerStarted","Data":"aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6"} Dec 01 23:09:27 crc kubenswrapper[4962]: I1201 23:09:27.316398 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerID="aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6" exitCode=0 Dec 01 23:09:27 crc kubenswrapper[4962]: I1201 23:09:27.316512 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dpmw" event={"ID":"2f55786d-e472-4a9d-b23d-edaf5a52f306","Type":"ContainerDied","Data":"aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6"} Dec 01 23:09:28 crc kubenswrapper[4962]: I1201 23:09:28.332890 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dpmw" event={"ID":"2f55786d-e472-4a9d-b23d-edaf5a52f306","Type":"ContainerStarted","Data":"4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23"} Dec 01 23:09:28 crc kubenswrapper[4962]: I1201 23:09:28.377095 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7dpmw" podStartSLOduration=2.680799931 podStartE2EDuration="6.377072834s" podCreationTimestamp="2025-12-01 23:09:22 +0000 UTC" firstStartedPulling="2025-12-01 23:09:24.269245649 +0000 UTC m=+5748.370684874" lastFinishedPulling="2025-12-01 23:09:27.965518582 +0000 UTC m=+5752.066957777" observedRunningTime="2025-12-01 23:09:28.353919497 +0000 UTC m=+5752.455358692" watchObservedRunningTime="2025-12-01 23:09:28.377072834 +0000 UTC m=+5752.478512039" 
Dec 01 23:09:32 crc kubenswrapper[4962]: I1201 23:09:32.785024 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:09:32 crc kubenswrapper[4962]: I1201 23:09:32.785652 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:09:33 crc kubenswrapper[4962]: I1201 23:09:33.171749 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:33 crc kubenswrapper[4962]: I1201 23:09:33.172231 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:33 crc kubenswrapper[4962]: I1201 23:09:33.241914 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:33 crc kubenswrapper[4962]: I1201 23:09:33.460250 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:33 crc kubenswrapper[4962]: I1201 23:09:33.512445 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7dpmw"] Dec 01 23:09:35 crc kubenswrapper[4962]: I1201 23:09:35.421775 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7dpmw" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="registry-server" containerID="cri-o://4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23" gracePeriod=2 Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.059246 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.151043 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-catalog-content\") pod \"2f55786d-e472-4a9d-b23d-edaf5a52f306\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.151642 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpbmk\" (UniqueName: \"kubernetes.io/projected/2f55786d-e472-4a9d-b23d-edaf5a52f306-kube-api-access-hpbmk\") pod \"2f55786d-e472-4a9d-b23d-edaf5a52f306\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.151727 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-utilities\") pod \"2f55786d-e472-4a9d-b23d-edaf5a52f306\" (UID: \"2f55786d-e472-4a9d-b23d-edaf5a52f306\") " Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.153706 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-utilities" (OuterVolumeSpecName: "utilities") pod "2f55786d-e472-4a9d-b23d-edaf5a52f306" (UID: "2f55786d-e472-4a9d-b23d-edaf5a52f306"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.164222 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f55786d-e472-4a9d-b23d-edaf5a52f306-kube-api-access-hpbmk" (OuterVolumeSpecName: "kube-api-access-hpbmk") pod "2f55786d-e472-4a9d-b23d-edaf5a52f306" (UID: "2f55786d-e472-4a9d-b23d-edaf5a52f306"). InnerVolumeSpecName "kube-api-access-hpbmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.239023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f55786d-e472-4a9d-b23d-edaf5a52f306" (UID: "2f55786d-e472-4a9d-b23d-edaf5a52f306"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.254959 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.254992 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpbmk\" (UniqueName: \"kubernetes.io/projected/2f55786d-e472-4a9d-b23d-edaf5a52f306-kube-api-access-hpbmk\") on node \"crc\" DevicePath \"\"" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.255008 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f55786d-e472-4a9d-b23d-edaf5a52f306-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.441589 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerID="4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23" exitCode=0 Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.441659 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dpmw" event={"ID":"2f55786d-e472-4a9d-b23d-edaf5a52f306","Type":"ContainerDied","Data":"4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23"} Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.441688 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7dpmw" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.441724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dpmw" event={"ID":"2f55786d-e472-4a9d-b23d-edaf5a52f306","Type":"ContainerDied","Data":"e6cad3435f850e05dcde0d84b1eb04baa252eee961aa72fa1d8233215d97bae2"} Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.441745 4962 scope.go:117] "RemoveContainer" containerID="4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.483255 4962 scope.go:117] "RemoveContainer" containerID="aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.494652 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7dpmw"] Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.514902 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7dpmw"] Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.519140 4962 scope.go:117] "RemoveContainer" containerID="f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.589752 4962 scope.go:117] "RemoveContainer" containerID="4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23" Dec 01 23:09:36 crc kubenswrapper[4962]: E1201 23:09:36.590307 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23\": container with ID starting with 4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23 not found: ID does not exist" containerID="4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.590479 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23"} err="failed to get container status \"4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23\": rpc error: code = NotFound desc = could not find container \"4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23\": container with ID starting with 4d1dd8a58a5123718dd3671d5c82f94fb092b0a80e2f6477433dee77c3d52e23 not found: ID does not exist" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.590583 4962 scope.go:117] "RemoveContainer" containerID="aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6" Dec 01 23:09:36 crc kubenswrapper[4962]: E1201 23:09:36.591114 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6\": container with ID starting with aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6 not found: ID does not exist" containerID="aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.591240 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6"} err="failed to get container status \"aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6\": rpc error: code = NotFound desc = could not find container \"aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6\": container with ID starting with aa80f7d8842b58ad1381883a4451b4ed29fc9d2a7e2e77803ac812668e4df6f6 not found: ID does not exist" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.591338 4962 scope.go:117] "RemoveContainer" containerID="f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0" Dec 01 23:09:36 crc kubenswrapper[4962]: E1201 23:09:36.591740 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0\": container with ID starting with f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0 not found: ID does not exist" containerID="f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0" Dec 01 23:09:36 crc kubenswrapper[4962]: I1201 23:09:36.591826 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0"} err="failed to get container status \"f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0\": rpc error: code = NotFound desc = could not find container \"f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0\": container with ID starting with f6af8a23b19ad511bb485b36dfa298d07b87507d231373d21a4bdd631a4d93c0 not found: ID does not exist" Dec 01 23:09:38 crc kubenswrapper[4962]: I1201 23:09:38.262776 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" path="/var/lib/kubelet/pods/2f55786d-e472-4a9d-b23d-edaf5a52f306/volumes" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.588736 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nqpbs"] Dec 01 23:09:59 crc kubenswrapper[4962]: E1201 23:09:59.590268 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="extract-content" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.590287 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="extract-content" Dec 01 23:09:59 crc kubenswrapper[4962]: E1201 23:09:59.590324 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="registry-server" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.590333 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="registry-server" Dec 01 23:09:59 crc kubenswrapper[4962]: E1201 23:09:59.590376 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="extract-utilities" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.590386 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="extract-utilities" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.590700 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f55786d-e472-4a9d-b23d-edaf5a52f306" containerName="registry-server" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.593245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.604230 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqpbs"] Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.751618 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-catalog-content\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.751913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-utilities\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.751962 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlbl\" (UniqueName: \"kubernetes.io/projected/0d609100-530f-41a2-b44e-3de196ed28eb-kube-api-access-lqlbl\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.853949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-utilities\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.854041 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqlbl\" (UniqueName: \"kubernetes.io/projected/0d609100-530f-41a2-b44e-3de196ed28eb-kube-api-access-lqlbl\") pod 
\"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.854145 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-catalog-content\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.854536 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-utilities\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.854811 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-catalog-content\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:09:59 crc kubenswrapper[4962]: I1201 23:09:59.880986 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqlbl\" (UniqueName: \"kubernetes.io/projected/0d609100-530f-41a2-b44e-3de196ed28eb-kube-api-access-lqlbl\") pod \"certified-operators-nqpbs\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:00 crc kubenswrapper[4962]: I1201 23:10:00.047850 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:00 crc kubenswrapper[4962]: I1201 23:10:00.580520 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqpbs"] Dec 01 23:10:00 crc kubenswrapper[4962]: I1201 23:10:00.762889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqpbs" event={"ID":"0d609100-530f-41a2-b44e-3de196ed28eb","Type":"ContainerStarted","Data":"350b76d3b7237f2173b047a6f9122ad997885066ae877d3cd6c44c0e3047ab95"} Dec 01 23:10:01 crc kubenswrapper[4962]: I1201 23:10:01.776832 4962 generic.go:334] "Generic (PLEG): container finished" podID="0d609100-530f-41a2-b44e-3de196ed28eb" containerID="00f149960531efb732cdc0905d0102d08a38fa3ff6128086e46777076d4b0ed9" exitCode=0 Dec 01 23:10:01 crc kubenswrapper[4962]: I1201 23:10:01.776973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqpbs" event={"ID":"0d609100-530f-41a2-b44e-3de196ed28eb","Type":"ContainerDied","Data":"00f149960531efb732cdc0905d0102d08a38fa3ff6128086e46777076d4b0ed9"} Dec 01 23:10:02 crc kubenswrapper[4962]: I1201 23:10:02.785778 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:10:02 crc kubenswrapper[4962]: I1201 23:10:02.786706 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:10:02 crc kubenswrapper[4962]: I1201 23:10:02.786769 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 23:10:02 crc kubenswrapper[4962]: I1201 23:10:02.788255 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f0dd1a4061cd9271344a31e8d89eab4bdd66bf2f565beae87f3ca6dd0c8508b"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 23:10:02 crc kubenswrapper[4962]: I1201 23:10:02.788322 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://4f0dd1a4061cd9271344a31e8d89eab4bdd66bf2f565beae87f3ca6dd0c8508b" gracePeriod=600 Dec 01 23:10:03 crc kubenswrapper[4962]: I1201 23:10:03.833807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqpbs" event={"ID":"0d609100-530f-41a2-b44e-3de196ed28eb","Type":"ContainerStarted","Data":"b7776b9affde50adda616fa8c1d22d767f6a06641939320e37e814eca10daa2e"} Dec 01 23:10:03 crc kubenswrapper[4962]: I1201 23:10:03.838467 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="4f0dd1a4061cd9271344a31e8d89eab4bdd66bf2f565beae87f3ca6dd0c8508b" exitCode=0 Dec 01 23:10:03 crc kubenswrapper[4962]: 
I1201 23:10:03.838509 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"4f0dd1a4061cd9271344a31e8d89eab4bdd66bf2f565beae87f3ca6dd0c8508b"} Dec 01 23:10:03 crc kubenswrapper[4962]: I1201 23:10:03.838537 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff"} Dec 01 23:10:03 crc kubenswrapper[4962]: I1201 23:10:03.838555 4962 scope.go:117] "RemoveContainer" containerID="559664edff742d9bb8a40e35e20ed3e4e8d8c1e191acbe7b87ce869314f431aa" Dec 01 23:10:04 crc kubenswrapper[4962]: I1201 23:10:04.865700 4962 generic.go:334] "Generic (PLEG): container finished" podID="0d609100-530f-41a2-b44e-3de196ed28eb" containerID="b7776b9affde50adda616fa8c1d22d767f6a06641939320e37e814eca10daa2e" exitCode=0 Dec 01 23:10:04 crc kubenswrapper[4962]: I1201 23:10:04.866032 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqpbs" event={"ID":"0d609100-530f-41a2-b44e-3de196ed28eb","Type":"ContainerDied","Data":"b7776b9affde50adda616fa8c1d22d767f6a06641939320e37e814eca10daa2e"} Dec 01 23:10:05 crc kubenswrapper[4962]: I1201 23:10:05.888315 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqpbs" event={"ID":"0d609100-530f-41a2-b44e-3de196ed28eb","Type":"ContainerStarted","Data":"62f49520d7224580fe8acd20a929579e3118ad6664bbd2441cf799efa0301755"} Dec 01 23:10:05 crc kubenswrapper[4962]: I1201 23:10:05.919898 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nqpbs" podStartSLOduration=3.245892919 podStartE2EDuration="6.919877342s" podCreationTimestamp="2025-12-01 23:09:59 +0000 UTC" firstStartedPulling="2025-12-01 23:10:01.780316385 +0000 UTC m=+5785.881755620" lastFinishedPulling="2025-12-01 23:10:05.454300838 +0000 UTC m=+5789.555740043" observedRunningTime="2025-12-01 23:10:05.910018102 +0000 UTC m=+5790.011457307" watchObservedRunningTime="2025-12-01 23:10:05.919877342 +0000 UTC m=+5790.021316537" Dec 01 23:10:10 crc kubenswrapper[4962]: I1201 23:10:10.048722 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:10 crc kubenswrapper[4962]: I1201 23:10:10.049574 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:11 crc kubenswrapper[4962]: I1201 23:10:11.123999 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nqpbs" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="registry-server" probeResult="failure" output=< Dec 01 23:10:11 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 23:10:11 crc kubenswrapper[4962]: > Dec 01 23:10:20 crc kubenswrapper[4962]: I1201 23:10:20.115503 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:20 crc kubenswrapper[4962]: I1201 23:10:20.167878 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:20 crc 
Dec 01 23:10:20 crc kubenswrapper[4962]: I1201 23:10:20.380712 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqpbs"] Dec 01 23:10:22 crc kubenswrapper[4962]: I1201 23:10:22.130530 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nqpbs" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="registry-server" containerID="cri-o://62f49520d7224580fe8acd20a929579e3118ad6664bbd2441cf799efa0301755" gracePeriod=2 Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.147169 4962 generic.go:334] "Generic (PLEG): container finished" podID="0d609100-530f-41a2-b44e-3de196ed28eb" containerID="62f49520d7224580fe8acd20a929579e3118ad6664bbd2441cf799efa0301755" exitCode=0 Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.147271 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqpbs" event={"ID":"0d609100-530f-41a2-b44e-3de196ed28eb","Type":"ContainerDied","Data":"62f49520d7224580fe8acd20a929579e3118ad6664bbd2441cf799efa0301755"} Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.353154 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.473309 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqlbl\" (UniqueName: \"kubernetes.io/projected/0d609100-530f-41a2-b44e-3de196ed28eb-kube-api-access-lqlbl\") pod \"0d609100-530f-41a2-b44e-3de196ed28eb\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.473420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-utilities\") pod \"0d609100-530f-41a2-b44e-3de196ed28eb\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.473695 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-catalog-content\") pod \"0d609100-530f-41a2-b44e-3de196ed28eb\" (UID: \"0d609100-530f-41a2-b44e-3de196ed28eb\") " Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.474453 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-utilities" (OuterVolumeSpecName: "utilities") pod "0d609100-530f-41a2-b44e-3de196ed28eb" (UID: "0d609100-530f-41a2-b44e-3de196ed28eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.482211 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d609100-530f-41a2-b44e-3de196ed28eb-kube-api-access-lqlbl" (OuterVolumeSpecName: "kube-api-access-lqlbl") pod "0d609100-530f-41a2-b44e-3de196ed28eb" (UID: "0d609100-530f-41a2-b44e-3de196ed28eb"). InnerVolumeSpecName "kube-api-access-lqlbl".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.544752 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d609100-530f-41a2-b44e-3de196ed28eb" (UID: "0d609100-530f-41a2-b44e-3de196ed28eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.578840 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqlbl\" (UniqueName: \"kubernetes.io/projected/0d609100-530f-41a2-b44e-3de196ed28eb-kube-api-access-lqlbl\") on node \"crc\" DevicePath \"\"" Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.578886 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:10:23 crc kubenswrapper[4962]: I1201 23:10:23.578899 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d609100-530f-41a2-b44e-3de196ed28eb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:10:24 crc kubenswrapper[4962]: I1201 23:10:24.162493 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqpbs" event={"ID":"0d609100-530f-41a2-b44e-3de196ed28eb","Type":"ContainerDied","Data":"350b76d3b7237f2173b047a6f9122ad997885066ae877d3cd6c44c0e3047ab95"} Dec 01 23:10:24 crc kubenswrapper[4962]: I1201 23:10:24.163693 4962 scope.go:117] "RemoveContainer" containerID="62f49520d7224580fe8acd20a929579e3118ad6664bbd2441cf799efa0301755" Dec 01 23:10:24 crc kubenswrapper[4962]: I1201 23:10:24.162613 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nqpbs" Dec 01 23:10:24 crc kubenswrapper[4962]: I1201 23:10:24.208813 4962 scope.go:117] "RemoveContainer" containerID="b7776b9affde50adda616fa8c1d22d767f6a06641939320e37e814eca10daa2e" Dec 01 23:10:24 crc kubenswrapper[4962]: I1201 23:10:24.248685 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqpbs"] Dec 01 23:10:24 crc kubenswrapper[4962]: I1201 23:10:24.248745 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nqpbs"] Dec 01 23:10:24 crc kubenswrapper[4962]: I1201 23:10:24.261719 4962 scope.go:117] "RemoveContainer" containerID="00f149960531efb732cdc0905d0102d08a38fa3ff6128086e46777076d4b0ed9" Dec 01 23:10:26 crc kubenswrapper[4962]: I1201 23:10:26.242214 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" path="/var/lib/kubelet/pods/0d609100-530f-41a2-b44e-3de196ed28eb/volumes" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.065640 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pjm9h"] Dec 01 23:12:17 crc kubenswrapper[4962]: E1201 23:12:17.067128 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="extract-content" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.067162 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="extract-content" Dec 01 23:12:17 crc kubenswrapper[4962]: E1201 23:12:17.067194 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="extract-utilities" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.067206 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="extract-utilities" Dec 01 23:12:17 crc kubenswrapper[4962]: E1201 23:12:17.067224 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="registry-server" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.067236 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="registry-server" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.067696 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d609100-530f-41a2-b44e-3de196ed28eb" containerName="registry-server" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.070528 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.071604 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzj9q\" (UniqueName: \"kubernetes.io/projected/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-kube-api-access-hzj9q\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.072264 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-utilities\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.072646 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-catalog-content\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.099052 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjm9h"] Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.177973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzj9q\" (UniqueName: \"kubernetes.io/projected/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-kube-api-access-hzj9q\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.178072 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-utilities\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.178105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-catalog-content\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.178698 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-catalog-content\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.179251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-utilities\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.218654 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hzj9q\" (UniqueName: \"kubernetes.io/projected/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-kube-api-access-hzj9q\") pod \"redhat-marketplace-pjm9h\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.401614 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:17 crc kubenswrapper[4962]: I1201 23:12:17.885120 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjm9h"] Dec 01 23:12:17 crc kubenswrapper[4962]: W1201 23:12:17.888315 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b184910_eb11_4a66_a7cf_c2ee7c6e18b8.slice/crio-fe6a1e37822bafa42d57f7482710978bfd7244abec78a38a57f7ea772ffe1d42 WatchSource:0}: Error finding container fe6a1e37822bafa42d57f7482710978bfd7244abec78a38a57f7ea772ffe1d42: Status 404 returned error can't find the container with id fe6a1e37822bafa42d57f7482710978bfd7244abec78a38a57f7ea772ffe1d42 Dec 01 23:12:18 crc kubenswrapper[4962]: I1201 23:12:18.770800 4962 generic.go:334] "Generic (PLEG): container finished" podID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerID="f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024" exitCode=0 Dec 01 23:12:18 crc kubenswrapper[4962]: I1201 23:12:18.770851 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjm9h" event={"ID":"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8","Type":"ContainerDied","Data":"f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024"} Dec 01 23:12:18 crc kubenswrapper[4962]: I1201 23:12:18.771162 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjm9h" event={"ID":"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8","Type":"ContainerStarted","Data":"fe6a1e37822bafa42d57f7482710978bfd7244abec78a38a57f7ea772ffe1d42"} Dec 01 23:12:20 crc kubenswrapper[4962]: I1201 23:12:20.801816 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjm9h" event={"ID":"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8","Type":"ContainerStarted","Data":"bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340"} Dec 01 23:12:21 crc kubenswrapper[4962]: I1201 23:12:21.226154 4962 trace.go:236] Trace[427013055]: "Calculate volume metrics of wal for pod openshift-logging/logging-loki-ingester-0" (01-Dec-2025 23:12:20.217) (total time: 1008ms): Dec 01 23:12:21 crc kubenswrapper[4962]: Trace[427013055]: [1.008288084s] [1.008288084s] END Dec 01 23:12:21 crc kubenswrapper[4962]: I1201 23:12:21.818978 4962 generic.go:334] "Generic (PLEG): container finished" podID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerID="bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340" exitCode=0 Dec 01 23:12:21 crc kubenswrapper[4962]: I1201 23:12:21.819054 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjm9h" event={"ID":"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8","Type":"ContainerDied","Data":"bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340"} Dec 01 23:12:22 crc kubenswrapper[4962]: I1201 23:12:22.834433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjm9h" 
event={"ID":"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8","Type":"ContainerStarted","Data":"7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d"} Dec 01 23:12:22 crc kubenswrapper[4962]: I1201 23:12:22.858062 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pjm9h" podStartSLOduration=2.215273243 podStartE2EDuration="5.858041391s" podCreationTimestamp="2025-12-01 23:12:17 +0000 UTC" firstStartedPulling="2025-12-01 23:12:18.774378079 +0000 UTC m=+5922.875817274" lastFinishedPulling="2025-12-01 23:12:22.417146217 +0000 UTC m=+5926.518585422" observedRunningTime="2025-12-01 23:12:22.851704711 +0000 UTC m=+5926.953143906" watchObservedRunningTime="2025-12-01 23:12:22.858041391 +0000 UTC m=+5926.959480586" Dec 01 23:12:27 crc kubenswrapper[4962]: I1201 23:12:27.402417 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:27 crc kubenswrapper[4962]: I1201 23:12:27.403159 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:27 crc kubenswrapper[4962]: I1201 23:12:27.496170 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:27 crc kubenswrapper[4962]: I1201 23:12:27.948684 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:28 crc kubenswrapper[4962]: I1201 23:12:28.015321 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjm9h"] Dec 01 23:12:29 crc kubenswrapper[4962]: I1201 23:12:29.923439 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pjm9h" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="registry-server" containerID="cri-o://7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d" gracePeriod=2 Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.533290 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.574798 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-utilities\") pod \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.575148 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzj9q\" (UniqueName: \"kubernetes.io/projected/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-kube-api-access-hzj9q\") pod \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.575241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-catalog-content\") pod \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\" (UID: \"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8\") " Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.575800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-utilities" (OuterVolumeSpecName: "utilities") pod "4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" (UID: "4b184910-eb11-4a66-a7cf-c2ee7c6e18b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.577981 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.587607 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-kube-api-access-hzj9q" (OuterVolumeSpecName: "kube-api-access-hzj9q") pod "4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" (UID: "4b184910-eb11-4a66-a7cf-c2ee7c6e18b8"). InnerVolumeSpecName "kube-api-access-hzj9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.600960 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" (UID: "4b184910-eb11-4a66-a7cf-c2ee7c6e18b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.679404 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzj9q\" (UniqueName: \"kubernetes.io/projected/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-kube-api-access-hzj9q\") on node \"crc\" DevicePath \"\"" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.679437 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.936525 4962 generic.go:334] "Generic (PLEG): container finished" podID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerID="7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d" exitCode=0 Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.936564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjm9h" event={"ID":"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8","Type":"ContainerDied","Data":"7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d"} Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.936609 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pjm9h" event={"ID":"4b184910-eb11-4a66-a7cf-c2ee7c6e18b8","Type":"ContainerDied","Data":"fe6a1e37822bafa42d57f7482710978bfd7244abec78a38a57f7ea772ffe1d42"} Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.936629 4962 scope.go:117] "RemoveContainer" containerID="7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d" Dec 01 23:12:30 crc kubenswrapper[4962]: I1201 23:12:30.937367 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pjm9h" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:30.972255 4962 scope.go:117] "RemoveContainer" containerID="bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:30.992913 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjm9h"] Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.004921 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pjm9h"] Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.026841 4962 scope.go:117] "RemoveContainer" containerID="f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.059549 4962 scope.go:117] "RemoveContainer" containerID="7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d" Dec 01 23:12:31 crc kubenswrapper[4962]: E1201 23:12:31.060223 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d\": container with ID starting with 7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d not found: ID does not exist" containerID="7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.060265 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d"} err="failed to get container status \"7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d\": rpc error: code = NotFound desc = could not find container \"7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d\": container with ID starting with 7d5b32486bd2c44ecf78f46bb26d39028ab414657c664da8e7e638b4f99f8e5d not found: ID does not exist" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.060291 4962 scope.go:117] "RemoveContainer" containerID="bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340" Dec 01 23:12:31 crc kubenswrapper[4962]: E1201 23:12:31.060832 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340\": container with ID starting with bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340 not found: ID does not exist" containerID="bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.060902 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340"} err="failed to get container status \"bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340\": rpc error: code = NotFound desc = could not find container \"bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340\": container with ID starting with bf883a4231b09d78e7c2bd98b52cab70af1d6b7887e175776201f186ffa85340 not found: ID does not exist" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.060965 4962 scope.go:117] "RemoveContainer" containerID="f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024" Dec 01 23:12:31 crc kubenswrapper[4962]: E1201 23:12:31.063274 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024\": container with ID starting with f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024 not found: ID does not exist" containerID="f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024" Dec 01 23:12:31 crc kubenswrapper[4962]: I1201 23:12:31.063342 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024"} err="failed to get container status \"f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024\": rpc error: code = NotFound desc = could not find container \"f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024\": container with ID starting with f1888bf4ecce807a19afb969fedc66fbcc52a8178bff7fef7c60c55952cd6024 not found: ID does not exist" Dec 01 23:12:32 crc kubenswrapper[4962]: I1201 23:12:32.243370 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" path="/var/lib/kubelet/pods/4b184910-eb11-4a66-a7cf-c2ee7c6e18b8/volumes" Dec 01 23:12:32 crc kubenswrapper[4962]: I1201 23:12:32.785030 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:12:32 crc kubenswrapper[4962]: I1201 23:12:32.785324 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:13:02 crc kubenswrapper[4962]: I1201 23:13:02.784283 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:13:02 crc kubenswrapper[4962]: I1201 23:13:02.785004 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:13:32 crc kubenswrapper[4962]: I1201 23:13:32.784930 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:13:32 crc kubenswrapper[4962]: I1201 23:13:32.785760 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:13:32 crc kubenswrapper[4962]: I1201 23:13:32.785822 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 23:13:32 crc kubenswrapper[4962]: I1201 23:13:32.786996 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 23:13:32 crc kubenswrapper[4962]: I1201 23:13:32.787097 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" gracePeriod=600 Dec 01 23:13:32 crc kubenswrapper[4962]: E1201 23:13:32.940173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:13:33 crc kubenswrapper[4962]: I1201 23:13:33.924602 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" exitCode=0 Dec 01 23:13:33 crc kubenswrapper[4962]: I1201 23:13:33.924664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff"} Dec 01 23:13:33 crc kubenswrapper[4962]: I1201 23:13:33.924710 4962 scope.go:117] "RemoveContainer" containerID="4f0dd1a4061cd9271344a31e8d89eab4bdd66bf2f565beae87f3ca6dd0c8508b" Dec 01 23:13:33 crc kubenswrapper[4962]: I1201 23:13:33.925900 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:13:33 crc kubenswrapper[4962]: E1201 23:13:33.926600 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:13:46 crc kubenswrapper[4962]: I1201 23:13:46.232597 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:13:46 crc kubenswrapper[4962]: E1201 23:13:46.233871 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:13:58 crc 
Dec 01 23:13:58 crc kubenswrapper[4962]: I1201 23:13:58.221385 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:13:58 crc kubenswrapper[4962]: E1201 23:13:58.222094 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:14:12 crc kubenswrapper[4962]: I1201 23:14:12.222075 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:14:12 crc kubenswrapper[4962]: E1201 23:14:12.223229 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:14:25 crc kubenswrapper[4962]: I1201 23:14:25.220127 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:14:25 crc kubenswrapper[4962]: E1201 23:14:25.221232 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:14:39 crc kubenswrapper[4962]: I1201 23:14:39.220714 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:14:39 crc kubenswrapper[4962]: E1201 23:14:39.221631 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:14:50 crc kubenswrapper[4962]: I1201 23:14:50.220248 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:14:50 crc kubenswrapper[4962]: E1201 23:14:50.221331 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.176655 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm"] Dec 01 23:15:00 crc
kubenswrapper[4962]: E1201 23:15:00.177904 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="extract-utilities" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.177947 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="extract-utilities" Dec 01 23:15:00 crc kubenswrapper[4962]: E1201 23:15:00.177969 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="registry-server" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.177977 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="registry-server" Dec 01 23:15:00 crc kubenswrapper[4962]: E1201 23:15:00.178012 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="extract-content" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.178021 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="extract-content" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.178327 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b184910-eb11-4a66-a7cf-c2ee7c6e18b8" containerName="registry-server" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.179446 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.185719 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.190481 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.201710 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm"] Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.238764 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8dkd\" (UniqueName: \"kubernetes.io/projected/35dde230-6822-4679-8436-376d5fae4be0-kube-api-access-n8dkd\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.239183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35dde230-6822-4679-8436-376d5fae4be0-config-volume\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.239300 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35dde230-6822-4679-8436-376d5fae4be0-secret-volume\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc 
kubenswrapper[4962]: I1201 23:15:00.340889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35dde230-6822-4679-8436-376d5fae4be0-config-volume\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.341006 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35dde230-6822-4679-8436-376d5fae4be0-secret-volume\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.341123 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8dkd\" (UniqueName: \"kubernetes.io/projected/35dde230-6822-4679-8436-376d5fae4be0-kube-api-access-n8dkd\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.343044 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35dde230-6822-4679-8436-376d5fae4be0-config-volume\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.389049 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35dde230-6822-4679-8436-376d5fae4be0-secret-volume\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.390266 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8dkd\" (UniqueName: \"kubernetes.io/projected/35dde230-6822-4679-8436-376d5fae4be0-kube-api-access-n8dkd\") pod \"collect-profiles-29410515-nwjkm\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:00 crc kubenswrapper[4962]: I1201 23:15:00.508767 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:01 crc kubenswrapper[4962]: I1201 23:15:01.026039 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm"] Dec 01 23:15:01 crc kubenswrapper[4962]: I1201 23:15:01.105616 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" event={"ID":"35dde230-6822-4679-8436-376d5fae4be0","Type":"ContainerStarted","Data":"18af75a92533e11451f8e4fc54cd73c9bdefb437c0cda608c5047537cdae4783"} Dec 01 23:15:02 crc kubenswrapper[4962]: I1201 23:15:02.119781 4962 generic.go:334] "Generic (PLEG): container finished" podID="35dde230-6822-4679-8436-376d5fae4be0" containerID="8d5c67d879f8922ecd3f11e4d27e92c440f7d07880d713b609cf269464a82ed5" exitCode=0 Dec 01 23:15:02 crc kubenswrapper[4962]: I1201 23:15:02.119902 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" event={"ID":"35dde230-6822-4679-8436-376d5fae4be0","Type":"ContainerDied","Data":"8d5c67d879f8922ecd3f11e4d27e92c440f7d07880d713b609cf269464a82ed5"} Dec 01 23:15:02 crc kubenswrapper[4962]: I1201 23:15:02.221046 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:15:02 crc kubenswrapper[4962]: E1201 23:15:02.221619 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.581818 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.617611 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8dkd\" (UniqueName: \"kubernetes.io/projected/35dde230-6822-4679-8436-376d5fae4be0-kube-api-access-n8dkd\") pod \"35dde230-6822-4679-8436-376d5fae4be0\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.617737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35dde230-6822-4679-8436-376d5fae4be0-secret-volume\") pod \"35dde230-6822-4679-8436-376d5fae4be0\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.617788 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35dde230-6822-4679-8436-376d5fae4be0-config-volume\") pod \"35dde230-6822-4679-8436-376d5fae4be0\" (UID: \"35dde230-6822-4679-8436-376d5fae4be0\") " Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.619009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35dde230-6822-4679-8436-376d5fae4be0-config-volume" (OuterVolumeSpecName: "config-volume") pod "35dde230-6822-4679-8436-376d5fae4be0" (UID: "35dde230-6822-4679-8436-376d5fae4be0"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.627244 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35dde230-6822-4679-8436-376d5fae4be0-kube-api-access-n8dkd" (OuterVolumeSpecName: "kube-api-access-n8dkd") pod "35dde230-6822-4679-8436-376d5fae4be0" (UID: "35dde230-6822-4679-8436-376d5fae4be0"). InnerVolumeSpecName "kube-api-access-n8dkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.627298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35dde230-6822-4679-8436-376d5fae4be0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35dde230-6822-4679-8436-376d5fae4be0" (UID: "35dde230-6822-4679-8436-376d5fae4be0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.720165 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8dkd\" (UniqueName: \"kubernetes.io/projected/35dde230-6822-4679-8436-376d5fae4be0-kube-api-access-n8dkd\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.720200 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35dde230-6822-4679-8436-376d5fae4be0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:03 crc kubenswrapper[4962]: I1201 23:15:03.720214 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35dde230-6822-4679-8436-376d5fae4be0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:04 crc kubenswrapper[4962]: I1201 23:15:04.148137 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" event={"ID":"35dde230-6822-4679-8436-376d5fae4be0","Type":"ContainerDied","Data":"18af75a92533e11451f8e4fc54cd73c9bdefb437c0cda608c5047537cdae4783"} Dec 01 23:15:04 crc kubenswrapper[4962]: I1201 23:15:04.148424 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18af75a92533e11451f8e4fc54cd73c9bdefb437c0cda608c5047537cdae4783" Dec 01 23:15:04 crc kubenswrapper[4962]: I1201 23:15:04.148227 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410515-nwjkm" Dec 01 23:15:04 crc kubenswrapper[4962]: I1201 23:15:04.659862 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"] Dec 01 23:15:04 crc kubenswrapper[4962]: I1201 23:15:04.670318 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410470-pbdpn"] Dec 01 23:15:06 crc kubenswrapper[4962]: I1201 23:15:06.232069 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43abc92-633e-496a-9ec0-68bb2520c12d" path="/var/lib/kubelet/pods/e43abc92-633e-496a-9ec0-68bb2520c12d/volumes" Dec 01 23:15:09 crc kubenswrapper[4962]: I1201 23:15:09.342175 4962 scope.go:117] "RemoveContainer" containerID="c91095aa1dd359fbddea12128fa7e2c98fb93ea7a9b63ecddcfbea15f9b270da" Dec 01 23:15:16 crc kubenswrapper[4962]: I1201 23:15:16.237392 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:15:16 crc kubenswrapper[4962]: E1201 23:15:16.238318 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:15:20 crc kubenswrapper[4962]: I1201 23:15:20.378184 4962 generic.go:334] "Generic (PLEG): container finished" podID="07461b2c-c45f-45cf-a540-4c24797e3f16" containerID="a0ab3808c752994cfb40eb5bf5f32918c1237cf6c55c98ddc6929d7e2787602e" exitCode=0 Dec 01 23:15:20 crc kubenswrapper[4962]: I1201 23:15:20.378325 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"07461b2c-c45f-45cf-a540-4c24797e3f16","Type":"ContainerDied","Data":"a0ab3808c752994cfb40eb5bf5f32918c1237cf6c55c98ddc6929d7e2787602e"} Dec 01 23:15:21 crc kubenswrapper[4962]: I1201 23:15:21.869542 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013012 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-config-data\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013132 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ca-certs\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013166 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013284 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78mnh\" (UniqueName: \"kubernetes.io/projected/07461b2c-c45f-45cf-a540-4c24797e3f16-kube-api-access-78mnh\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013399 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-temporary\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config-secret\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ssh-key\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013642 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-workdir\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.013675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"07461b2c-c45f-45cf-a540-4c24797e3f16\" (UID: \"07461b2c-c45f-45cf-a540-4c24797e3f16\") " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.015918 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.017889 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-config-data" (OuterVolumeSpecName: "config-data") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.019332 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.021834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07461b2c-c45f-45cf-a540-4c24797e3f16-kube-api-access-78mnh" (OuterVolumeSpecName: "kube-api-access-78mnh") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "kube-api-access-78mnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.022225 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.047790 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.048551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.051323 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.085857 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "07461b2c-c45f-45cf-a540-4c24797e3f16" (UID: "07461b2c-c45f-45cf-a540-4c24797e3f16"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.117372 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.117849 4962 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.118041 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.118166 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78mnh\" (UniqueName: \"kubernetes.io/projected/07461b2c-c45f-45cf-a540-4c24797e3f16-kube-api-access-78mnh\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.118256 4962 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.118340 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.118417 4962 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07461b2c-c45f-45cf-a540-4c24797e3f16-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.118904 4962 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/07461b2c-c45f-45cf-a540-4c24797e3f16-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.119531 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.147121 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.224144 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.407519 4962 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"07461b2c-c45f-45cf-a540-4c24797e3f16","Type":"ContainerDied","Data":"5072c03127bd12d3e58ebd9fbdfdee2424fe9cf1e80c53f50a6bd6cc15fc6d98"} Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.407575 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5072c03127bd12d3e58ebd9fbdfdee2424fe9cf1e80c53f50a6bd6cc15fc6d98" Dec 01 23:15:22 crc kubenswrapper[4962]: I1201 23:15:22.407584 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.221333 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:15:31 crc kubenswrapper[4962]: E1201 23:15:31.222364 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.846242 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 23:15:31 crc kubenswrapper[4962]: E1201 23:15:31.847191 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35dde230-6822-4679-8436-376d5fae4be0" containerName="collect-profiles" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.847251 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="35dde230-6822-4679-8436-376d5fae4be0" containerName="collect-profiles" Dec 01 23:15:31 crc kubenswrapper[4962]: E1201 23:15:31.847288 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07461b2c-c45f-45cf-a540-4c24797e3f16" containerName="tempest-tests-tempest-tests-runner" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.847302 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="07461b2c-c45f-45cf-a540-4c24797e3f16" containerName="tempest-tests-tempest-tests-runner" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.847766 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="35dde230-6822-4679-8436-376d5fae4be0" containerName="collect-profiles" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.847805 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="07461b2c-c45f-45cf-a540-4c24797e3f16" containerName="tempest-tests-tempest-tests-runner" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.849344 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.852632 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tqkv8" Dec 01 23:15:31 crc kubenswrapper[4962]: I1201 23:15:31.878530 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.014618 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"792d09ec-504b-41d0-a382-4503283ad0d5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.014791 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfrjm\" (UniqueName: \"kubernetes.io/projected/792d09ec-504b-41d0-a382-4503283ad0d5-kube-api-access-tfrjm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"792d09ec-504b-41d0-a382-4503283ad0d5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.117128 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfrjm\" (UniqueName: \"kubernetes.io/projected/792d09ec-504b-41d0-a382-4503283ad0d5-kube-api-access-tfrjm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"792d09ec-504b-41d0-a382-4503283ad0d5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.118543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"792d09ec-504b-41d0-a382-4503283ad0d5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.119612 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"792d09ec-504b-41d0-a382-4503283ad0d5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.140622 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfrjm\" (UniqueName: \"kubernetes.io/projected/792d09ec-504b-41d0-a382-4503283ad0d5-kube-api-access-tfrjm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"792d09ec-504b-41d0-a382-4503283ad0d5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.164804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"792d09ec-504b-41d0-a382-4503283ad0d5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 23:15:32 crc 
kubenswrapper[4962]: I1201 23:15:32.180383 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.688201 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 01 23:15:32 crc kubenswrapper[4962]: I1201 23:15:32.690885 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 23:15:33 crc kubenswrapper[4962]: I1201 23:15:33.563125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"792d09ec-504b-41d0-a382-4503283ad0d5","Type":"ContainerStarted","Data":"3f69637104234a55f4f30e51a2768e5d1976698e2fb87c971e3056f5a1438777"}
Dec 01 23:15:34 crc kubenswrapper[4962]: I1201 23:15:34.578063 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"792d09ec-504b-41d0-a382-4503283ad0d5","Type":"ContainerStarted","Data":"63be5e8902dd27f3c009ffd266183e5d806b44e62309f36a6657801c62b55ecf"}
Dec 01 23:15:34 crc kubenswrapper[4962]: I1201 23:15:34.602739 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.5635095 podStartE2EDuration="3.602718651s" podCreationTimestamp="2025-12-01 23:15:31 +0000 UTC" firstStartedPulling="2025-12-01 23:15:32.690513922 +0000 UTC m=+6116.791953107" lastFinishedPulling="2025-12-01 23:15:33.729723053 +0000 UTC m=+6117.831162258" observedRunningTime="2025-12-01 23:15:34.600739435 +0000 UTC m=+6118.702178630" watchObservedRunningTime="2025-12-01 23:15:34.602718651 +0000 UTC m=+6118.704157856"
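
The pod_startup_latency_tracker entry above decodes as: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling), i.e. the startup figure with pull time excluded. A short Go check against the timestamps logged above (the layout string is Go's default time format; nothing else is assumed):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        parse := func(s string) time.Time {
            // Layout matches Go's default Time.String() output, which is
            // what the kubelet logged above (fractional seconds optional).
            t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-12-01 23:15:31 +0000 UTC")
        firstPull := parse("2025-12-01 23:15:32.690513922 +0000 UTC")
        lastPull := parse("2025-12-01 23:15:33.729723053 +0000 UTC")
        observed := parse("2025-12-01 23:15:34.602718651 +0000 UTC")

        e2e := observed.Sub(created)          // end-to-end startup duration
        slo := e2e - lastPull.Sub(firstPull)  // minus the image-pull window
        fmt.Println(e2e, slo)                 // 3.602718651s 2.56350952s
    }

Both printed values match the podStartE2EDuration and podStartSLOduration figures in the entry.
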
Need to start a new one" pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.420893 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ps2df"/"default-dockercfg-tqrp6" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.420902 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ps2df"/"openshift-service-ca.crt" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.421735 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ps2df"/"kube-root-ca.crt" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.444547 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ps2df/must-gather-5pcl2"] Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.476373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlspn\" (UniqueName: \"kubernetes.io/projected/f31dec33-accc-4d65-88f7-1c3e6d179671-kube-api-access-jlspn\") pod \"must-gather-5pcl2\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.476583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f31dec33-accc-4d65-88f7-1c3e6d179671-must-gather-output\") pod \"must-gather-5pcl2\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.578197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f31dec33-accc-4d65-88f7-1c3e6d179671-must-gather-output\") pod \"must-gather-5pcl2\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.578325 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlspn\" (UniqueName: \"kubernetes.io/projected/f31dec33-accc-4d65-88f7-1c3e6d179671-kube-api-access-jlspn\") pod \"must-gather-5pcl2\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.579102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f31dec33-accc-4d65-88f7-1c3e6d179671-must-gather-output\") pod \"must-gather-5pcl2\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.599083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlspn\" (UniqueName: \"kubernetes.io/projected/f31dec33-accc-4d65-88f7-1c3e6d179671-kube-api-access-jlspn\") pod \"must-gather-5pcl2\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:02 crc kubenswrapper[4962]: I1201 23:16:02.744423 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:16:03 crc kubenswrapper[4962]: I1201 23:16:03.258907 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ps2df/must-gather-5pcl2"] Dec 01 23:16:04 crc kubenswrapper[4962]: I1201 23:16:04.024015 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/must-gather-5pcl2" event={"ID":"f31dec33-accc-4d65-88f7-1c3e6d179671","Type":"ContainerStarted","Data":"056ec1e3bed30b6bf04071dffe0318e72edf41f44c188954595f3907ee108da1"} Dec 01 23:16:08 crc kubenswrapper[4962]: I1201 23:16:08.219727 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:16:08 crc kubenswrapper[4962]: E1201 23:16:08.220961 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:16:10 crc kubenswrapper[4962]: I1201 23:16:10.099049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/must-gather-5pcl2" event={"ID":"f31dec33-accc-4d65-88f7-1c3e6d179671","Type":"ContainerStarted","Data":"5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2"} Dec 01 23:16:10 crc kubenswrapper[4962]: I1201 23:16:10.099575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/must-gather-5pcl2" event={"ID":"f31dec33-accc-4d65-88f7-1c3e6d179671","Type":"ContainerStarted","Data":"361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c"} Dec 01 23:16:10 crc kubenswrapper[4962]: I1201 23:16:10.114781 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ps2df/must-gather-5pcl2" podStartSLOduration=1.9329240049999998 podStartE2EDuration="8.114764879s" podCreationTimestamp="2025-12-01 23:16:02 +0000 UTC" firstStartedPulling="2025-12-01 23:16:03.278043682 +0000 UTC m=+6147.379482897" lastFinishedPulling="2025-12-01 23:16:09.459884536 +0000 UTC m=+6153.561323771" observedRunningTime="2025-12-01 23:16:10.113586805 +0000 UTC m=+6154.215026000" watchObservedRunningTime="2025-12-01 23:16:10.114764879 +0000 UTC m=+6154.216204074" Dec 01 23:16:14 crc kubenswrapper[4962]: I1201 23:16:14.843540 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ps2df/crc-debug-dr78f"] Dec 01 23:16:14 crc kubenswrapper[4962]: I1201 23:16:14.847310 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:14 crc kubenswrapper[4962]: I1201 23:16:14.904970 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-host\") pod \"crc-debug-dr78f\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") " pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:14 crc kubenswrapper[4962]: I1201 23:16:14.905079 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7k7t\" (UniqueName: \"kubernetes.io/projected/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-kube-api-access-j7k7t\") pod \"crc-debug-dr78f\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") " pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:15 crc kubenswrapper[4962]: I1201 23:16:15.007696 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-host\") pod \"crc-debug-dr78f\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") " pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:15 crc kubenswrapper[4962]: I1201 23:16:15.007782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7k7t\" (UniqueName: \"kubernetes.io/projected/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-kube-api-access-j7k7t\") pod \"crc-debug-dr78f\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") " pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:15 crc kubenswrapper[4962]: I1201 23:16:15.008181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-host\") pod \"crc-debug-dr78f\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") " pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:15 crc kubenswrapper[4962]: I1201 23:16:15.041379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7k7t\" (UniqueName: \"kubernetes.io/projected/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-kube-api-access-j7k7t\") pod \"crc-debug-dr78f\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") " pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:15 crc kubenswrapper[4962]: I1201 23:16:15.167468 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:16:16 crc kubenswrapper[4962]: I1201 23:16:16.180018 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-dr78f" event={"ID":"3a19f8c6-82c0-403c-bf2c-2a28e74c2376","Type":"ContainerStarted","Data":"002fb957a239c0975e2f3b7dadea1481f8593843998cf3d82839739b39a0d167"} Dec 01 23:16:20 crc kubenswrapper[4962]: I1201 23:16:20.219736 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:16:20 crc kubenswrapper[4962]: E1201 23:16:20.220507 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:16:28 crc kubenswrapper[4962]: I1201 23:16:28.334977 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-dr78f" event={"ID":"3a19f8c6-82c0-403c-bf2c-2a28e74c2376","Type":"ContainerStarted","Data":"f4d6205daa2c6812d0d5f064d1d86997b28ac62d45ade3e0ba3b8e20b6f9276b"} Dec 01 23:16:28 crc kubenswrapper[4962]: I1201 23:16:28.364829 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ps2df/crc-debug-dr78f" podStartSLOduration=1.707262351 podStartE2EDuration="14.364807273s" podCreationTimestamp="2025-12-01 23:16:14 +0000 UTC" firstStartedPulling="2025-12-01 23:16:15.208271979 +0000 UTC m=+6159.309711174" lastFinishedPulling="2025-12-01 23:16:27.865816901 +0000 UTC m=+6171.967256096" observedRunningTime="2025-12-01 23:16:28.35730767 +0000 UTC m=+6172.458746865" watchObservedRunningTime="2025-12-01 23:16:28.364807273 +0000 UTC m=+6172.466246478" Dec 01 23:16:31 crc kubenswrapper[4962]: I1201 23:16:31.219719 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:16:31 crc kubenswrapper[4962]: E1201 23:16:31.220577 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:16:45 crc kubenswrapper[4962]: I1201 23:16:45.220327 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:16:45 crc kubenswrapper[4962]: E1201 23:16:45.221255 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:17:00 crc kubenswrapper[4962]: I1201 23:17:00.221169 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 
Dec 01 23:17:00 crc kubenswrapper[4962]: E1201 23:17:00.221909 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 23:17:12 crc kubenswrapper[4962]: I1201 23:17:12.219386 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff"
Dec 01 23:17:12 crc kubenswrapper[4962]: E1201 23:17:12.220053 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 23:17:15 crc kubenswrapper[4962]: I1201 23:17:15.878117 4962 generic.go:334] "Generic (PLEG): container finished" podID="3a19f8c6-82c0-403c-bf2c-2a28e74c2376" containerID="f4d6205daa2c6812d0d5f064d1d86997b28ac62d45ade3e0ba3b8e20b6f9276b" exitCode=0
Dec 01 23:17:15 crc kubenswrapper[4962]: I1201 23:17:15.878260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-dr78f" event={"ID":"3a19f8c6-82c0-403c-bf2c-2a28e74c2376","Type":"ContainerDied","Data":"f4d6205daa2c6812d0d5f064d1d86997b28ac62d45ade3e0ba3b8e20b6f9276b"}
Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.040443 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-dr78f"
Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.083143 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-host\") pod \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") "
Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.083792 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7k7t\" (UniqueName: \"kubernetes.io/projected/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-kube-api-access-j7k7t\") pod \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\" (UID: \"3a19f8c6-82c0-403c-bf2c-2a28e74c2376\") "
Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.084282 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-host" (OuterVolumeSpecName: "host") pod "3a19f8c6-82c0-403c-bf2c-2a28e74c2376" (UID: "3a19f8c6-82c0-403c-bf2c-2a28e74c2376"). InnerVolumeSpecName "host".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.084686 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-host\") on node \"crc\" DevicePath \"\"" Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.085282 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ps2df/crc-debug-dr78f"] Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.097122 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ps2df/crc-debug-dr78f"] Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.103852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-kube-api-access-j7k7t" (OuterVolumeSpecName: "kube-api-access-j7k7t") pod "3a19f8c6-82c0-403c-bf2c-2a28e74c2376" (UID: "3a19f8c6-82c0-403c-bf2c-2a28e74c2376"). InnerVolumeSpecName "kube-api-access-j7k7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.185857 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7k7t\" (UniqueName: \"kubernetes.io/projected/3a19f8c6-82c0-403c-bf2c-2a28e74c2376-kube-api-access-j7k7t\") on node \"crc\" DevicePath \"\"" Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.906272 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="002fb957a239c0975e2f3b7dadea1481f8593843998cf3d82839739b39a0d167" Dec 01 23:17:17 crc kubenswrapper[4962]: I1201 23:17:17.906328 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-dr78f" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.234059 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a19f8c6-82c0-403c-bf2c-2a28e74c2376" path="/var/lib/kubelet/pods/3a19f8c6-82c0-403c-bf2c-2a28e74c2376/volumes" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.365078 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ps2df/crc-debug-fltcn"] Dec 01 23:17:18 crc kubenswrapper[4962]: E1201 23:17:18.365913 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a19f8c6-82c0-403c-bf2c-2a28e74c2376" containerName="container-00" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.366045 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a19f8c6-82c0-403c-bf2c-2a28e74c2376" containerName="container-00" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.366408 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a19f8c6-82c0-403c-bf2c-2a28e74c2376" containerName="container-00" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.367641 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.522684 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-host\") pod \"crc-debug-fltcn\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.522752 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sschr\" (UniqueName: \"kubernetes.io/projected/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-kube-api-access-sschr\") pod \"crc-debug-fltcn\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.625863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-host\") pod \"crc-debug-fltcn\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.625926 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sschr\" (UniqueName: \"kubernetes.io/projected/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-kube-api-access-sschr\") pod \"crc-debug-fltcn\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.626023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-host\") pod \"crc-debug-fltcn\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.644211 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sschr\" (UniqueName: \"kubernetes.io/projected/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-kube-api-access-sschr\") pod \"crc-debug-fltcn\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.684828 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:18 crc kubenswrapper[4962]: I1201 23:17:18.925342 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-fltcn" event={"ID":"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b","Type":"ContainerStarted","Data":"81f3d5ad72a1f024beb493925d857ad2c3c1179be79338fac82b88d19caff9ac"} Dec 01 23:17:19 crc kubenswrapper[4962]: E1201 23:17:19.335225 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb4b384_02d9_419b_9c32_dd4fcedd6d2b.slice/crio-7612b67d6c56216b58753cfacc25857ec263f6e885b006fc9a9f49732c6b1d60.scope\": RecentStats: unable to find data in memory cache]" Dec 01 23:17:19 crc kubenswrapper[4962]: I1201 23:17:19.940265 4962 generic.go:334] "Generic (PLEG): container finished" podID="3bb4b384-02d9-419b-9c32-dd4fcedd6d2b" containerID="7612b67d6c56216b58753cfacc25857ec263f6e885b006fc9a9f49732c6b1d60" exitCode=0 Dec 01 23:17:19 crc kubenswrapper[4962]: I1201 23:17:19.940316 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-fltcn" event={"ID":"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b","Type":"ContainerDied","Data":"7612b67d6c56216b58753cfacc25857ec263f6e885b006fc9a9f49732c6b1d60"} Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.053277 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.179140 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-host\") pod \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.179291 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-host" (OuterVolumeSpecName: "host") pod "3bb4b384-02d9-419b-9c32-dd4fcedd6d2b" (UID: "3bb4b384-02d9-419b-9c32-dd4fcedd6d2b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.179309 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sschr\" (UniqueName: \"kubernetes.io/projected/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-kube-api-access-sschr\") pod \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\" (UID: \"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b\") " Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.180096 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-host\") on node \"crc\" DevicePath \"\"" Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.185355 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-kube-api-access-sschr" (OuterVolumeSpecName: "kube-api-access-sschr") pod "3bb4b384-02d9-419b-9c32-dd4fcedd6d2b" (UID: "3bb4b384-02d9-419b-9c32-dd4fcedd6d2b"). InnerVolumeSpecName "kube-api-access-sschr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.283244 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sschr\" (UniqueName: \"kubernetes.io/projected/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b-kube-api-access-sschr\") on node \"crc\" DevicePath \"\"" Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.962174 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-fltcn" event={"ID":"3bb4b384-02d9-419b-9c32-dd4fcedd6d2b","Type":"ContainerDied","Data":"81f3d5ad72a1f024beb493925d857ad2c3c1179be79338fac82b88d19caff9ac"} Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.962482 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f3d5ad72a1f024beb493925d857ad2c3c1179be79338fac82b88d19caff9ac" Dec 01 23:17:21 crc kubenswrapper[4962]: I1201 23:17:21.962553 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fltcn" Dec 01 23:17:22 crc kubenswrapper[4962]: I1201 23:17:22.338623 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ps2df/crc-debug-fltcn"] Dec 01 23:17:22 crc kubenswrapper[4962]: I1201 23:17:22.351582 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ps2df/crc-debug-fltcn"] Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.546229 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ps2df/crc-debug-fv7w2"] Dec 01 23:17:23 crc kubenswrapper[4962]: E1201 23:17:23.547140 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb4b384-02d9-419b-9c32-dd4fcedd6d2b" containerName="container-00" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.547156 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb4b384-02d9-419b-9c32-dd4fcedd6d2b" containerName="container-00" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.547500 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb4b384-02d9-419b-9c32-dd4fcedd6d2b" containerName="container-00" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.548588 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.746723 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-host\") pod \"crc-debug-fv7w2\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.747347 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bvvm\" (UniqueName: \"kubernetes.io/projected/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-kube-api-access-7bvvm\") pod \"crc-debug-fv7w2\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.849790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-host\") pod \"crc-debug-fv7w2\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.849867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bvvm\" (UniqueName: \"kubernetes.io/projected/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-kube-api-access-7bvvm\") pod \"crc-debug-fv7w2\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.849929 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-host\") pod \"crc-debug-fv7w2\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:23 crc kubenswrapper[4962]: I1201 23:17:23.886225 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bvvm\" (UniqueName: \"kubernetes.io/projected/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-kube-api-access-7bvvm\") pod \"crc-debug-fv7w2\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:24 crc kubenswrapper[4962]: I1201 23:17:24.166370 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:24 crc kubenswrapper[4962]: I1201 23:17:24.246112 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb4b384-02d9-419b-9c32-dd4fcedd6d2b" path="/var/lib/kubelet/pods/3bb4b384-02d9-419b-9c32-dd4fcedd6d2b/volumes" Dec 01 23:17:25 crc kubenswrapper[4962]: I1201 23:17:25.004234 4962 generic.go:334] "Generic (PLEG): container finished" podID="213950ca-98fb-4dd3-acf2-7b7b2b6a104c" containerID="e8458f124dcfc2007c1741ee18b3d8dbba3312ad0d0b8edf20947860c5a3b994" exitCode=0 Dec 01 23:17:25 crc kubenswrapper[4962]: I1201 23:17:25.004284 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-fv7w2" event={"ID":"213950ca-98fb-4dd3-acf2-7b7b2b6a104c","Type":"ContainerDied","Data":"e8458f124dcfc2007c1741ee18b3d8dbba3312ad0d0b8edf20947860c5a3b994"} Dec 01 23:17:25 crc kubenswrapper[4962]: I1201 23:17:25.004314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/crc-debug-fv7w2" event={"ID":"213950ca-98fb-4dd3-acf2-7b7b2b6a104c","Type":"ContainerStarted","Data":"01b1e20e02c5d10081d6a97ba2847a6ee0477525e4059ac77d36f52de649bc62"} Dec 01 23:17:25 crc kubenswrapper[4962]: I1201 23:17:25.053024 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ps2df/crc-debug-fv7w2"] Dec 01 23:17:25 crc kubenswrapper[4962]: I1201 23:17:25.066282 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ps2df/crc-debug-fv7w2"] Dec 01 23:17:26 crc kubenswrapper[4962]: I1201 23:17:26.152446 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:26 crc kubenswrapper[4962]: I1201 23:17:26.311436 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-host\") pod \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " Dec 01 23:17:26 crc kubenswrapper[4962]: I1201 23:17:26.311744 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bvvm\" (UniqueName: \"kubernetes.io/projected/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-kube-api-access-7bvvm\") pod \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\" (UID: \"213950ca-98fb-4dd3-acf2-7b7b2b6a104c\") " Dec 01 23:17:26 crc kubenswrapper[4962]: I1201 23:17:26.311847 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-host" (OuterVolumeSpecName: "host") pod "213950ca-98fb-4dd3-acf2-7b7b2b6a104c" (UID: "213950ca-98fb-4dd3-acf2-7b7b2b6a104c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 23:17:26 crc kubenswrapper[4962]: I1201 23:17:26.313176 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-host\") on node \"crc\" DevicePath \"\"" Dec 01 23:17:26 crc kubenswrapper[4962]: I1201 23:17:26.317350 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-kube-api-access-7bvvm" (OuterVolumeSpecName: "kube-api-access-7bvvm") pod "213950ca-98fb-4dd3-acf2-7b7b2b6a104c" (UID: "213950ca-98fb-4dd3-acf2-7b7b2b6a104c"). InnerVolumeSpecName "kube-api-access-7bvvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:17:26 crc kubenswrapper[4962]: I1201 23:17:26.414966 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bvvm\" (UniqueName: \"kubernetes.io/projected/213950ca-98fb-4dd3-acf2-7b7b2b6a104c-kube-api-access-7bvvm\") on node \"crc\" DevicePath \"\"" Dec 01 23:17:27 crc kubenswrapper[4962]: I1201 23:17:27.025444 4962 scope.go:117] "RemoveContainer" containerID="e8458f124dcfc2007c1741ee18b3d8dbba3312ad0d0b8edf20947860c5a3b994" Dec 01 23:17:27 crc kubenswrapper[4962]: I1201 23:17:27.025706 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ps2df/crc-debug-fv7w2" Dec 01 23:17:27 crc kubenswrapper[4962]: I1201 23:17:27.220263 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:17:27 crc kubenswrapper[4962]: E1201 23:17:27.220621 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:17:28 crc kubenswrapper[4962]: I1201 23:17:28.234120 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="213950ca-98fb-4dd3-acf2-7b7b2b6a104c" path="/var/lib/kubelet/pods/213950ca-98fb-4dd3-acf2-7b7b2b6a104c/volumes" Dec 01 23:17:41 crc kubenswrapper[4962]: I1201 23:17:41.219855 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:17:41 crc kubenswrapper[4962]: E1201 23:17:41.220759 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:17:51 crc kubenswrapper[4962]: I1201 23:17:51.810918 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-api/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.037860 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-evaluator/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.056255 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-listener/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.129493 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-notifier/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.240638 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c9498c86-5xmqd_91d67b1b-578f-46c7-aec8-83785d2fe411/barbican-api/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.317540 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-65c9498c86-5xmqd_91d67b1b-578f-46c7-aec8-83785d2fe411/barbican-api-log/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.411097 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7fbb7f6fcd-9l2gb_909a52c1-349c-4b1b-929f-7d2c554cad32/barbican-keystone-listener/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.551828 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7fbb7f6fcd-9l2gb_909a52c1-349c-4b1b-929f-7d2c554cad32/barbican-keystone-listener-log/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.605930 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76b577fdff-d85rv_0fcefe11-14bc-40f6-8552-da42f7b63977/barbican-worker/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.669143 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76b577fdff-d85rv_0fcefe11-14bc-40f6-8552-da42f7b63977/barbican-worker-log/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.790192 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td_6dc28247-9b3f-421b-a195-2f89ea5b50f8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:52 crc kubenswrapper[4962]: I1201 23:17:52.967238 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/ceilometer-central-agent/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.000688 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/proxy-httpd/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.039551 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/ceilometer-notification-agent/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.104682 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/sg-core/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.260875 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_356fbcf6-1bde-4b7b-bf5f-7be551d1a03c/cinder-api-log/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.326015 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_356fbcf6-1bde-4b7b-bf5f-7be551d1a03c/cinder-api/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.481461 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d/cinder-scheduler/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.510870 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d/probe/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.623749 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wl946_fae4b755-5a52-461b-939c-b870ddcc521b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.745477 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f_e004e0a7-fc4c-4236-a816-288147301262/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:53 crc kubenswrapper[4962]: I1201 23:17:53.822740 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-grqhx_c626973f-0e99-4e4b-bc2b-8caddbada7aa/init/0.log" Dec 01 23:17:54 crc kubenswrapper[4962]: I1201 23:17:54.045166 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-grqhx_c626973f-0e99-4e4b-bc2b-8caddbada7aa/init/0.log" Dec 01 23:17:54 crc kubenswrapper[4962]: I1201 23:17:54.110837 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z_de8a047d-3a82-4ffe-a734-76c25d8997e5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:54 crc kubenswrapper[4962]: I1201 23:17:54.119498 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-grqhx_c626973f-0e99-4e4b-bc2b-8caddbada7aa/dnsmasq-dns/0.log" Dec 01 23:17:54 crc kubenswrapper[4962]: I1201 23:17:54.337582 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_87780c0b-00e7-44cd-93da-c22f2b2a771c/glance-httpd/0.log" Dec 01 23:17:54 crc kubenswrapper[4962]: I1201 23:17:54.345165 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_87780c0b-00e7-44cd-93da-c22f2b2a771c/glance-log/0.log" Dec 01 23:17:54 crc kubenswrapper[4962]: I1201 23:17:54.535429 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_891b6978-5cc9-464e-ae37-f9f7b3dadc62/glance-log/0.log" Dec 01 23:17:54 crc kubenswrapper[4962]: I1201 23:17:54.549444 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_891b6978-5cc9-464e-ae37-f9f7b3dadc62/glance-httpd/0.log" Dec 01 23:17:55 crc kubenswrapper[4962]: I1201 23:17:55.219262 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:17:55 crc kubenswrapper[4962]: E1201 23:17:55.219886 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:17:55 crc kubenswrapper[4962]: I1201 23:17:55.340297 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-77477565cc-t4xcz_673eaa4d-d246-4ca5-8f8e-7b464149d355/heat-api/0.log" Dec 01 23:17:55 crc kubenswrapper[4962]: I1201 23:17:55.444461 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-65cfb46b8d-72cj6_59a79ba3-6726-4020-8e97-80654b9cc661/heat-engine/0.log" Dec 01 23:17:55 crc kubenswrapper[4962]: I1201 23:17:55.480036 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6d7cf4b459-tkf5n_bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4/heat-cfnapi/0.log" Dec 01 23:17:55 crc kubenswrapper[4962]: I1201 23:17:55.732679 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-544xz_3dd32d50-cae8-4762-ba07-ce8d8d1996b8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:56 crc kubenswrapper[4962]: I1201 23:17:56.153502 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-68jbf_d7f57b07-0c88-4569-9062-bbaaf50abefe/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:56 crc kubenswrapper[4962]: I1201 23:17:56.284136 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29410441-sgtnd_14dca7aa-3ee9-4af1-85ba-e92ac88fd223/keystone-cron/0.log" Dec 01 23:17:56 crc kubenswrapper[4962]: I1201 23:17:56.502347 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bfc984cd5-wc42c_e30f9e2d-e0be-4484-bf6b-83c39beaa7e6/keystone-api/0.log" Dec 01 23:17:56 crc kubenswrapper[4962]: I1201 23:17:56.637568 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29410501-d9brm_3395cab0-9781-4fec-8e37-a3c4be3aca9a/keystone-cron/0.log" Dec 01 23:17:56 crc kubenswrapper[4962]: I1201 23:17:56.812968 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_412ecf69-be53-4cb2-9ea4-867884bbf8cf/kube-state-metrics/0.log" Dec 01 23:17:56 crc kubenswrapper[4962]: I1201 23:17:56.925638 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx_db6f9af8-342d-4a5d-bd75-21d8d0f95c04/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:57 crc kubenswrapper[4962]: I1201 23:17:57.024784 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-xrsvk_e3c41ae3-36b2-43dd-9580-fac72dc88d09/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:57 crc kubenswrapper[4962]: I1201 23:17:57.305522 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_7a4079d4-140a-438c-a252-c0669217e113/mysqld-exporter/0.log" Dec 01 23:17:57 crc kubenswrapper[4962]: I1201 23:17:57.626797 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b9776df9c-m5wv9_6ab1fe3f-42f6-4652-8d33-5f97b860b8fc/neutron-api/0.log" Dec 01 23:17:57 crc kubenswrapper[4962]: I1201 23:17:57.628388 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz_dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:57 crc kubenswrapper[4962]: I1201 23:17:57.736357 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b9776df9c-m5wv9_6ab1fe3f-42f6-4652-8d33-5f97b860b8fc/neutron-httpd/0.log" Dec 01 23:17:58 crc kubenswrapper[4962]: I1201 23:17:58.290396 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9b1a4101-b960-4d3a-bba2-8472f8b2a726/nova-cell0-conductor-conductor/0.log" Dec 01 23:17:58 crc kubenswrapper[4962]: I1201 23:17:58.674732 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_aabf746f-d3b5-4858-a2ec-9a5cec96720a/nova-api-log/0.log" Dec 01 23:17:58 crc kubenswrapper[4962]: I1201 23:17:58.712159 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7027bfd7-ae97-419b-aebd-11e811b45486/nova-cell1-conductor-conductor/0.log" Dec 01 23:17:58 crc 
kubenswrapper[4962]: I1201 23:17:58.976679 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bf253872-abad-4b40-b941-2cbada4988ac/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 23:17:59 crc kubenswrapper[4962]: I1201 23:17:59.023960 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-pl278_db0505bf-0445-4d21-9bc6-a483fdf94816/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:17:59 crc kubenswrapper[4962]: I1201 23:17:59.087120 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_aabf746f-d3b5-4858-a2ec-9a5cec96720a/nova-api-api/0.log" Dec 01 23:17:59 crc kubenswrapper[4962]: I1201 23:17:59.540561 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46f24440-8e28-4c9e-908d-ca07fd2edcfc/nova-metadata-log/0.log" Dec 01 23:17:59 crc kubenswrapper[4962]: I1201 23:17:59.754546 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_02074ca6-7293-4d4a-8354-f299b4ae4b5a/nova-scheduler-scheduler/0.log" Dec 01 23:17:59 crc kubenswrapper[4962]: I1201 23:17:59.776795 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_aebd10ab-b3dd-4bc7-8ea0-f5883d794715/mysql-bootstrap/0.log" Dec 01 23:17:59 crc kubenswrapper[4962]: I1201 23:17:59.965036 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_aebd10ab-b3dd-4bc7-8ea0-f5883d794715/galera/0.log" Dec 01 23:17:59 crc kubenswrapper[4962]: I1201 23:17:59.973556 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_aebd10ab-b3dd-4bc7-8ea0-f5883d794715/mysql-bootstrap/0.log" Dec 01 23:18:00 crc kubenswrapper[4962]: I1201 23:18:00.169827 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c09bcbbf-f96b-4f90-8f2d-9d635454a05e/mysql-bootstrap/0.log" Dec 01 23:18:00 crc kubenswrapper[4962]: I1201 23:18:00.331742 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c09bcbbf-f96b-4f90-8f2d-9d635454a05e/galera/0.log" Dec 01 23:18:00 crc kubenswrapper[4962]: I1201 23:18:00.340067 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c09bcbbf-f96b-4f90-8f2d-9d635454a05e/mysql-bootstrap/0.log" Dec 01 23:18:00 crc kubenswrapper[4962]: I1201 23:18:00.568328 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9fd1f254-7f23-46a6-b2fd-986de362e028/openstackclient/0.log" Dec 01 23:18:00 crc kubenswrapper[4962]: I1201 23:18:00.675705 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vqjlm_1f47734e-2a33-432b-8030-c82a75ec77c3/openstack-network-exporter/0.log" Dec 01 23:18:00 crc kubenswrapper[4962]: I1201 23:18:00.884758 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovsdb-server-init/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.175669 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovsdb-server/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.188053 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovsdb-server-init/0.log" Dec 01 
23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.189564 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovs-vswitchd/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.446710 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xd7ph_f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d/ovn-controller/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.624045 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hwvmq_4ea2d579-b57c-41a6-a255-ee852e50ec7a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.717475 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_08d09a46-a04a-4b53-aa6c-e24f284063f0/openstack-network-exporter/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.756828 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46f24440-8e28-4c9e-908d-ca07fd2edcfc/nova-metadata-metadata/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.866238 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_08d09a46-a04a-4b53-aa6c-e24f284063f0/ovn-northd/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.966785 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f893a462-9c1f-4b76-84fc-ba5e84364399/ovsdbserver-nb/0.log" Dec 01 23:18:01 crc kubenswrapper[4962]: I1201 23:18:01.977505 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f893a462-9c1f-4b76-84fc-ba5e84364399/openstack-network-exporter/0.log" Dec 01 23:18:02 crc kubenswrapper[4962]: I1201 23:18:02.291666 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe00e319-7859-4bac-9316-156263865d80/openstack-network-exporter/0.log" Dec 01 23:18:02 crc kubenswrapper[4962]: I1201 23:18:02.332069 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe00e319-7859-4bac-9316-156263865d80/ovsdbserver-sb/0.log" Dec 01 23:18:02 crc kubenswrapper[4962]: I1201 23:18:02.658364 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54647f544-d6jzt_6107fcdb-ceea-4953-a667-e3a973c68de3/placement-api/0.log" Dec 01 23:18:02 crc kubenswrapper[4962]: I1201 23:18:02.846194 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/init-config-reloader/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.010413 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54647f544-d6jzt_6107fcdb-ceea-4953-a667-e3a973c68de3/placement-log/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.090319 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/init-config-reloader/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.129313 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/config-reloader/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.144304 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/prometheus/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.241894 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/thanos-sidecar/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.389661 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42940ca4-6f73-42b9-97b9-8fcf3fa4f968/setup-container/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.562443 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42940ca4-6f73-42b9-97b9-8fcf3fa4f968/setup-container/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.584984 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42940ca4-6f73-42b9-97b9-8fcf3fa4f968/rabbitmq/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.643059 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2284f352-fb8b-4432-b26f-106c1255dd90/setup-container/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.850918 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2284f352-fb8b-4432-b26f-106c1255dd90/setup-container/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.941572 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2_aee89e9b-f93a-4100-bc51-0a701a9d9549/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:03 crc kubenswrapper[4962]: I1201 23:18:03.955324 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2284f352-fb8b-4432-b26f-106c1255dd90/rabbitmq/0.log" Dec 01 23:18:04 crc kubenswrapper[4962]: I1201 23:18:04.165512 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k_b8f0680f-6407-4e35-a927-3c0613e4f3e5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:04 crc kubenswrapper[4962]: I1201 23:18:04.221515 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qs68c_805f56ee-17d1-4e5a-8655-756050592352/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:04 crc kubenswrapper[4962]: I1201 23:18:04.475686 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mwgwj_1e8e98c7-cc77-45f5-be56-cb73df6427e4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:04 crc kubenswrapper[4962]: I1201 23:18:04.530294 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-stgzx_4356462c-43b0-40db-824b-f9abb87cb9dd/ssh-known-hosts-edpm-deployment/0.log" Dec 01 23:18:04 crc kubenswrapper[4962]: I1201 23:18:04.797416 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-65c954fcc-wpwn7_85abfbd6-374e-486e-93f1-8e8c4e8b5da0/proxy-server/0.log" Dec 01 23:18:04 crc kubenswrapper[4962]: I1201 23:18:04.978222 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-65c954fcc-wpwn7_85abfbd6-374e-486e-93f1-8e8c4e8b5da0/proxy-httpd/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.008714 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-ring-rebalance-5db2p_34cbe04f-2bf2-4b5e-bf91-00787b7e4fee/swift-ring-rebalance/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.110153 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-auditor/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.194796 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-reaper/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.353393 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-replicator/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.391842 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-server/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.405099 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-auditor/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.510298 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-replicator/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.588469 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-server/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.644983 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-updater/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.672079 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-auditor/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.784845 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-expirer/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.865470 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-replicator/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.919895 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-updater/0.log" Dec 01 23:18:05 crc kubenswrapper[4962]: I1201 23:18:05.939644 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-server/0.log" Dec 01 23:18:06 crc kubenswrapper[4962]: I1201 23:18:06.292328 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/rsync/0.log" Dec 01 23:18:06 crc kubenswrapper[4962]: I1201 23:18:06.303395 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/swift-recon-cron/0.log" Dec 01 23:18:06 crc kubenswrapper[4962]: I1201 23:18:06.537631 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6_a7fabf85-9e84-477d-9831-1f6ff8c52e3e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:06 crc kubenswrapper[4962]: I1201 23:18:06.624897 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz_6a103f12-9cb1-4018-9db7-67553233f69d/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:06 crc kubenswrapper[4962]: I1201 23:18:06.859109 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_792d09ec-504b-41d0-a382-4503283ad0d5/test-operator-logs-container/0.log" Dec 01 23:18:07 crc kubenswrapper[4962]: I1201 23:18:07.122608 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tjg57_f2a3fbd2-3eb8-4784-8442-3299926b0172/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:18:07 crc kubenswrapper[4962]: I1201 23:18:07.748485 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_07461b2c-c45f-45cf-a540-4c24797e3f16/tempest-tests-tempest-tests-runner/0.log" Dec 01 23:18:10 crc kubenswrapper[4962]: I1201 23:18:10.219259 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:18:10 crc kubenswrapper[4962]: E1201 23:18:10.219863 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:18:16 crc kubenswrapper[4962]: I1201 23:18:16.169019 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c1c35af6-81b8-418f-a1e9-e19209bab14d/memcached/0.log" Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.815068 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8jwqq"] Dec 01 23:18:18 crc kubenswrapper[4962]: E1201 23:18:18.816059 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="213950ca-98fb-4dd3-acf2-7b7b2b6a104c" containerName="container-00" Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.816071 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="213950ca-98fb-4dd3-acf2-7b7b2b6a104c" containerName="container-00" Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.816328 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="213950ca-98fb-4dd3-acf2-7b7b2b6a104c" containerName="container-00" Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.818247 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.841392 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8jwqq"]
Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.974019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-catalog-content\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.974310 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmxzr\" (UniqueName: \"kubernetes.io/projected/65401351-9641-407f-92f2-bda55ab8cfbf-kube-api-access-mmxzr\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:18 crc kubenswrapper[4962]: I1201 23:18:18.974583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-utilities\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.076728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-utilities\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.076834 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-catalog-content\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.076980 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmxzr\" (UniqueName: \"kubernetes.io/projected/65401351-9641-407f-92f2-bda55ab8cfbf-kube-api-access-mmxzr\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.077763 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-utilities\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.077989 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-catalog-content\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.113281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmxzr\" (UniqueName: \"kubernetes.io/projected/65401351-9641-407f-92f2-bda55ab8cfbf-kube-api-access-mmxzr\") pod \"redhat-operators-8jwqq\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " pod="openshift-marketplace/redhat-operators-8jwqq"
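The VerifyControllerAttachedVolume / MountVolume / SetUp sequence above corresponds to a small volume stanza in the pod spec. A minimal sketch in Go, assuming the conventional shapes (emptyDir sources behind the "kubernetes.io/empty-dir" UniqueNames, a kubelet-injected projected service-account token behind "kubernetes.io/projected"); only the volume names come from the log:

```go
// Sketch of the volume stanza implied by the reconciler entries above.
// The projected sources are an assumption (a real kube-api-access-* volume
// also projects the CA bundle and namespace), not read from the cluster.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	volumes := []corev1.Volume{
		// "kubernetes.io/empty-dir/..." UniqueNames map to EmptyDir sources.
		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		{Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		// "kubernetes.io/projected/..." maps to a projected volume carrying
		// the service-account token.
		{Name: "kube-api-access-mmxzr", VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{Sources: []corev1.VolumeProjection{
				{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
			}},
		}},
	}
	for _, v := range volumes {
		fmt.Println("volume:", v.Name)
	}
}
```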
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.148628 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:19 crc kubenswrapper[4962]: I1201 23:18:19.748060 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8jwqq"]
Dec 01 23:18:20 crc kubenswrapper[4962]: I1201 23:18:20.682209 4962 generic.go:334] "Generic (PLEG): container finished" podID="65401351-9641-407f-92f2-bda55ab8cfbf" containerID="1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d" exitCode=0
Dec 01 23:18:20 crc kubenswrapper[4962]: I1201 23:18:20.683431 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jwqq" event={"ID":"65401351-9641-407f-92f2-bda55ab8cfbf","Type":"ContainerDied","Data":"1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d"}
Dec 01 23:18:20 crc kubenswrapper[4962]: I1201 23:18:20.683528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jwqq" event={"ID":"65401351-9641-407f-92f2-bda55ab8cfbf","Type":"ContainerStarted","Data":"f51fdf304a5c968ad753915757fbabc1037f03bfba93ad9e86365a163e1aeba0"}
Dec 01 23:18:22 crc kubenswrapper[4962]: I1201 23:18:22.712712 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jwqq" event={"ID":"65401351-9641-407f-92f2-bda55ab8cfbf","Type":"ContainerStarted","Data":"bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0"}
Dec 01 23:18:23 crc kubenswrapper[4962]: I1201 23:18:23.220356 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff"
Dec 01 23:18:23 crc kubenswrapper[4962]: E1201 23:18:23.251134 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
Dec 01 23:18:25 crc kubenswrapper[4962]: I1201 23:18:25.777685 4962 generic.go:334] "Generic (PLEG): container finished" podID="65401351-9641-407f-92f2-bda55ab8cfbf" containerID="bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0" exitCode=0
Dec 01 23:18:25 crc kubenswrapper[4962]: I1201 23:18:25.778004 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jwqq" event={"ID":"65401351-9641-407f-92f2-bda55ab8cfbf","Type":"ContainerDied","Data":"bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0"}
Dec 01 23:18:26 crc kubenswrapper[4962]: I1201 23:18:26.790982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jwqq" event={"ID":"65401351-9641-407f-92f2-bda55ab8cfbf","Type":"ContainerStarted","Data":"18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b"}
Dec 01 23:18:26 crc kubenswrapper[4962]: I1201 23:18:26.817339 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8jwqq" podStartSLOduration=3.32588689 podStartE2EDuration="8.817317592s" podCreationTimestamp="2025-12-01 23:18:18 +0000 UTC" firstStartedPulling="2025-12-01 23:18:20.684157099 +0000 UTC m=+6284.785596294" lastFinishedPulling="2025-12-01 23:18:26.175587791 +0000 UTC m=+6290.277026996" observedRunningTime="2025-12-01 23:18:26.817117727 +0000 UTC m=+6290.918556922" watchObservedRunningTime="2025-12-01 23:18:26.817317592 +0000 UTC m=+6290.918756787"
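The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check in Go, with the monotonic "m=+..." suffixes dropped (the subtraction rule is inferred from the numbers, not quoted from kubelet source):

```go
// Back-of-the-envelope check of the startup-latency entry above.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-01 23:18:18 +0000 UTC")
	firstPull := mustParse("2025-12-01 23:18:20.684157099 +0000 UTC")
	lastPull := mustParse("2025-12-01 23:18:26.175587791 +0000 UTC")
	running := mustParse("2025-12-01 23:18:26.817317592 +0000 UTC")

	e2e := running.Sub(created)        // 8.817317592s, matching podStartE2EDuration
	pulling := lastPull.Sub(firstPull) // 5.491430692s by wall clock
	slo := e2e - pulling               // 3.3258869s
	// The kubelet's monotonic readings (the m=+... deltas) give a pull window
	// of 5.491430702s, hence its logged podStartSLOduration=3.32588689.
	fmt.Println(e2e, pulling, slo)
}
```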
Dec 01 23:18:29 crc kubenswrapper[4962]: I1201 23:18:29.149030 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:29 crc kubenswrapper[4962]: I1201 23:18:29.149484 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:30 crc kubenswrapper[4962]: I1201 23:18:30.273321 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8jwqq" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="registry-server" probeResult="failure" output=<
Dec 01 23:18:30 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s
Dec 01 23:18:30 crc kubenswrapper[4962]: >
Dec 01 23:18:36 crc kubenswrapper[4962]: I1201 23:18:36.236257 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff"
Dec 01 23:18:36 crc kubenswrapper[4962]: I1201 23:18:36.902019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"240fdf592428a97c287ff53cfcb71065a9dae3fb374060e47fe22b9eeb666e0a"}
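machine-config-daemon-b642k spends this whole stretch in a 5m0s CrashLoopBackOff (sync attempts skipped at 23:17:55, 23:18:10 and 23:18:23 before the successful restart just above). A hedged client-go sketch for spotting pods in that state from outside the node; the in-cluster setup is an assumption, but CrashLoopBackOff genuinely surfaces as a container Waiting reason, which is what pod_workers.go is reporting:

```go
// Sketch: list pods whose containers are waiting in CrashLoopBackOff.
// Assumes in-cluster credentials with pod list permissions; not kubelet code.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	pods, err := client.CoreV1().Pods("").List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, pod := range pods.Items {
		for _, cs := range pod.Status.ContainerStatuses {
			// The back-off state is exposed as a Waiting reason on the status.
			if cs.State.Waiting != nil && cs.State.Waiting.Reason == "CrashLoopBackOff" {
				fmt.Printf("%s/%s container=%s restarts=%d\n",
					pod.Namespace, pod.Name, cs.Name, cs.RestartCount)
			}
		}
	}
}
```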
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.197382 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/util/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.396425 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/util/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.415507 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/pull/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.428222 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/pull/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.592367 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/pull/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.594702 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/util/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.635948 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/extract/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.804015 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-t725d_0e2461fa-57b4-406a-9801-522b2e3ee2f0/kube-rbac-proxy/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.825581 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-t725d_0e2461fa-57b4-406a-9801-522b2e3ee2f0/manager/0.log"
Dec 01 23:18:38 crc kubenswrapper[4962]: I1201 23:18:38.923678 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-2nvkp_39217f35-ba4e-402b-84fe-876ca232ff60/kube-rbac-proxy/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.084641 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-2nvkp_39217f35-ba4e-402b-84fe-876ca232ff60/manager/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.106302 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-sqwvg_8e655cd6-3169-46a0-b299-37d13dae8d3a/kube-rbac-proxy/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.149739 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-sqwvg_8e655cd6-3169-46a0-b299-37d13dae8d3a/manager/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.211455 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.279546 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.312708 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-8xc97_d871da7f-4b47-4931-aa3b-1525f50b2bde/kube-rbac-proxy/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.403825 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-8xc97_d871da7f-4b47-4931-aa3b-1525f50b2bde/manager/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.445284 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8jwqq"]
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.501594 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-sjzl9_1a9bd198-45fa-40ba-b3a0-55c150c211d6/kube-rbac-proxy/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.633715 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-sjzl9_1a9bd198-45fa-40ba-b3a0-55c150c211d6/manager/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201 23:18:39.828036 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mllgh_fb3ad1a2-8ee0-4d12-8499-d10819081f1b/manager/0.log"
Dec 01 23:18:39 crc kubenswrapper[4962]: I1201
23:18:39.879756 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mllgh_fb3ad1a2-8ee0-4d12-8499-d10819081f1b/kube-rbac-proxy/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.028856 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-27r4m_2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8/kube-rbac-proxy/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.226603 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-27r4m_2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8/manager/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.242048 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-w68ng_d0ae9966-90f0-4d97-a056-dd9e86c81949/kube-rbac-proxy/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.259373 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-w68ng_d0ae9966-90f0-4d97-a056-dd9e86c81949/manager/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.413682 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-fkvsq_ed78cdfd-dc4e-4528-9542-6fc778f54e5f/kube-rbac-proxy/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.526978 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-fkvsq_ed78cdfd-dc4e-4528-9542-6fc778f54e5f/manager/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.599318 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zvp89_3673ec86-6e36-4f0b-ac14-87e5d89e283e/manager/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.656897 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zvp89_3673ec86-6e36-4f0b-ac14-87e5d89e283e/kube-rbac-proxy/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.747045 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nptld_fb72edda-e449-44f6-a85d-b74c0f3f9ad2/kube-rbac-proxy/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.791876 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nptld_fb72edda-e449-44f6-a85d-b74c0f3f9ad2/manager/0.log" Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.936524 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8jwqq" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="registry-server" containerID="cri-o://18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b" gracePeriod=2 Dec 01 23:18:40 crc kubenswrapper[4962]: I1201 23:18:40.962737 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-7d98c_795b9a42-a6d4-487b-84ef-0f1b3617ebfc/kube-rbac-proxy/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.005609 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-7d98c_795b9a42-a6d4-487b-84ef-0f1b3617ebfc/manager/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.105289 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mqrwk_1fb020cd-66c6-401d-be7e-9a26b62eb8d8/kube-rbac-proxy/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.321346 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mqrwk_1fb020cd-66c6-401d-be7e-9a26b62eb8d8/manager/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.352885 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xvbpf_f2e499a5-b89a-45d4-bd3e-9f743e010a51/kube-rbac-proxy/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.491867 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xvbpf_f2e499a5-b89a-45d4-bd3e-9f743e010a51/manager/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.585073 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc_fec45066-0c5d-48de-9c33-f166f33131f0/kube-rbac-proxy/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.653012 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8jwqq" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.659860 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc_fec45066-0c5d-48de-9c33-f166f33131f0/manager/0.log" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.712159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-catalog-content\") pod \"65401351-9641-407f-92f2-bda55ab8cfbf\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.712255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-utilities\") pod \"65401351-9641-407f-92f2-bda55ab8cfbf\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.712291 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmxzr\" (UniqueName: \"kubernetes.io/projected/65401351-9641-407f-92f2-bda55ab8cfbf-kube-api-access-mmxzr\") pod \"65401351-9641-407f-92f2-bda55ab8cfbf\" (UID: \"65401351-9641-407f-92f2-bda55ab8cfbf\") " Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.714758 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-utilities" (OuterVolumeSpecName: "utilities") pod "65401351-9641-407f-92f2-bda55ab8cfbf" (UID: "65401351-9641-407f-92f2-bda55ab8cfbf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.722742 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65401351-9641-407f-92f2-bda55ab8cfbf-kube-api-access-mmxzr" (OuterVolumeSpecName: "kube-api-access-mmxzr") pod "65401351-9641-407f-92f2-bda55ab8cfbf" (UID: "65401351-9641-407f-92f2-bda55ab8cfbf"). InnerVolumeSpecName "kube-api-access-mmxzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.816305 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.816343 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmxzr\" (UniqueName: \"kubernetes.io/projected/65401351-9641-407f-92f2-bda55ab8cfbf-kube-api-access-mmxzr\") on node \"crc\" DevicePath \"\"" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.841612 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65401351-9641-407f-92f2-bda55ab8cfbf" (UID: "65401351-9641-407f-92f2-bda55ab8cfbf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.920718 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65401351-9641-407f-92f2-bda55ab8cfbf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.970424 4962 generic.go:334] "Generic (PLEG): container finished" podID="65401351-9641-407f-92f2-bda55ab8cfbf" containerID="18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b" exitCode=0 Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.970466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jwqq" event={"ID":"65401351-9641-407f-92f2-bda55ab8cfbf","Type":"ContainerDied","Data":"18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b"} Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.970491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8jwqq" event={"ID":"65401351-9641-407f-92f2-bda55ab8cfbf","Type":"ContainerDied","Data":"f51fdf304a5c968ad753915757fbabc1037f03bfba93ad9e86365a163e1aeba0"} Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.970513 4962 scope.go:117] "RemoveContainer" containerID="18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b" Dec 01 23:18:41 crc kubenswrapper[4962]: I1201 23:18:41.970649 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8jwqq"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.016426 4962 scope.go:117] "RemoveContainer" containerID="bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.023482 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8jwqq"]
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.047235 4962 scope.go:117] "RemoveContainer" containerID="1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.071182 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8jwqq"]
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.087007 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sd67v_d151cbe8-8f07-425d-bd99-c06451f4a3cf/registry-server/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.105923 4962 scope.go:117] "RemoveContainer" containerID="18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b"
Dec 01 23:18:42 crc kubenswrapper[4962]: E1201 23:18:42.106686 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b\": container with ID starting with 18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b not found: ID does not exist" containerID="18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.106738 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b"} err="failed to get container status \"18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b\": rpc error: code = NotFound desc = could not find container \"18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b\": container with ID starting with 18443887513b125ab0520f7f079d4334d35c0b5da999719e6fa5dfa7511eb16b not found: ID does not exist"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.106760 4962 scope.go:117] "RemoveContainer" containerID="bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0"
Dec 01 23:18:42 crc kubenswrapper[4962]: E1201 23:18:42.107328 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0\": container with ID starting with bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0 not found: ID does not exist" containerID="bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.107358 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0"} err="failed to get container status \"bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0\": rpc error: code = NotFound desc = could not find container \"bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0\": container with ID starting with bf16ebb6d730cb5b74820072d08e27cec263037b262c8d6bb9d6a7bf89ccf7a0 not found: ID does not exist"
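The RemoveContainer / "ContainerStatus from runtime service failed" pairs above show the kubelet's delete path being idempotent: a gRPC NotFound from CRI-O means the container is already gone, and cleanup continues rather than failing. A minimal sketch of that pattern (removeContainer is a hypothetical stand-in wired to reproduce the logged error, not kubelet code):

```go
// Idempotent-delete pattern: NotFound from the runtime is "already removed".
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI round-trip that can race with an
// earlier removal, as in the log entries above.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound,
		"could not find container %q: container with ID starting with %s not found: ID does not exist", id, id)
}

func main() {
	id := "1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d"
	if err := removeContainer(id); err != nil && status.Code(err) != codes.NotFound {
		panic(err) // only a non-NotFound error is a real failure
	}
	fmt.Println("NotFound treated as already-removed; cleanup continues")
}
```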
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.107375 4962 scope.go:117] "RemoveContainer" containerID="1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d"
Dec 01 23:18:42 crc kubenswrapper[4962]: E1201 23:18:42.108108 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d\": container with ID starting with 1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d not found: ID does not exist" containerID="1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.108206 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d"} err="failed to get container status \"1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d\": rpc error: code = NotFound desc = could not find container \"1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d\": container with ID starting with 1fc7e1c8d1ffe62ed4620d4c20ae075cc70db6b12be841f431edd106043f236d not found: ID does not exist"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.185531 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5dc9c6958f-52l87_555a34ee-8a52-4159-8e01-ed6dcceb45e9/operator/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.233774 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" path="/var/lib/kubelet/pods/65401351-9641-407f-92f2-bda55ab8cfbf/volumes"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.279562 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hfkpq_d62cdff4-c4d1-44fb-99dc-bdd6a31d03af/kube-rbac-proxy/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.499415 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hfkpq_d62cdff4-c4d1-44fb-99dc-bdd6a31d03af/manager/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.582908 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4hgng_ec3039da-9f5e-4870-8579-8560a63221a8/kube-rbac-proxy/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.697010 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4hgng_ec3039da-9f5e-4870-8579-8560a63221a8/manager/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.795926 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lv89z_400ba839-34f0-4463-a318-c1bcba6e5039/operator/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.902533 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-6tfrn_03f5786b-da6f-4b56-ac07-fb563f0a85b4/kube-rbac-proxy/0.log"
Dec 01 23:18:42 crc kubenswrapper[4962]: I1201 23:18:42.955979 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-6tfrn_03f5786b-da6f-4b56-ac07-fb563f0a85b4/manager/0.log"
Dec 01 23:18:43 crc kubenswrapper[4962]: I1201 23:18:43.052046 4962 log.go:25] "Finished parsing log
file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6c484b4dc4-ch82f_af182ba4-78a6-41eb-bf65-8abd64207122/kube-rbac-proxy/0.log" Dec 01 23:18:43 crc kubenswrapper[4962]: I1201 23:18:43.127972 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d8646fccf-4h8tf_05992e60-e6fc-43a0-b44a-d177ae3f4c83/manager/0.log" Dec 01 23:18:43 crc kubenswrapper[4962]: I1201 23:18:43.339490 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zzc5v_0aff0b93-1032-412b-9628-3ab9e94717a8/kube-rbac-proxy/0.log" Dec 01 23:18:43 crc kubenswrapper[4962]: I1201 23:18:43.366797 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zzc5v_0aff0b93-1032-412b-9628-3ab9e94717a8/manager/0.log" Dec 01 23:18:43 crc kubenswrapper[4962]: I1201 23:18:43.421480 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6c484b4dc4-ch82f_af182ba4-78a6-41eb-bf65-8abd64207122/manager/0.log" Dec 01 23:18:43 crc kubenswrapper[4962]: I1201 23:18:43.437351 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-q7bxg_c847e733-65b6-4724-8037-5199d847f1ba/kube-rbac-proxy/0.log" Dec 01 23:18:43 crc kubenswrapper[4962]: I1201 23:18:43.550628 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-q7bxg_c847e733-65b6-4724-8037-5199d847f1ba/manager/0.log" Dec 01 23:19:04 crc kubenswrapper[4962]: I1201 23:19:04.404489 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nl5jh_00f6ed0c-f791-460d-acd4-d100a0b21710/control-plane-machine-set-operator/0.log" Dec 01 23:19:04 crc kubenswrapper[4962]: I1201 23:19:04.658214 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qj8zv_12136e64-010e-49bc-9c3e-d1c65467f361/kube-rbac-proxy/0.log" Dec 01 23:19:04 crc kubenswrapper[4962]: I1201 23:19:04.697553 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qj8zv_12136e64-010e-49bc-9c3e-d1c65467f361/machine-api-operator/0.log" Dec 01 23:19:18 crc kubenswrapper[4962]: I1201 23:19:18.917329 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lzx5s_09bfd310-14dd-4f11-90d4-2b67683a468a/cert-manager-controller/0.log" Dec 01 23:19:19 crc kubenswrapper[4962]: I1201 23:19:19.098291 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mfzw8_1ed989fa-af32-4ec2-9ead-2681d1b96741/cert-manager-cainjector/0.log" Dec 01 23:19:19 crc kubenswrapper[4962]: I1201 23:19:19.135411 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zwlk2_bce7cb19-aeee-4ad9-9284-46e78c5e1d6f/cert-manager-webhook/0.log" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.728293 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-srzbx"] Dec 01 23:19:31 crc kubenswrapper[4962]: E1201 23:19:31.729360 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" 
containerName="registry-server" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.729378 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="registry-server" Dec 01 23:19:31 crc kubenswrapper[4962]: E1201 23:19:31.729408 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="extract-utilities" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.729418 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="extract-utilities" Dec 01 23:19:31 crc kubenswrapper[4962]: E1201 23:19:31.729442 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="extract-content" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.729448 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="extract-content" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.729723 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="65401351-9641-407f-92f2-bda55ab8cfbf" containerName="registry-server" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.731697 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.740015 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srzbx"] Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.809567 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-catalog-content\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.809611 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qstwq\" (UniqueName: \"kubernetes.io/projected/b8a0e643-eae2-46fb-9a9a-2b0483085460-kube-api-access-qstwq\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.809801 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-utilities\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.912554 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-catalog-content\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.912600 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qstwq\" (UniqueName: \"kubernetes.io/projected/b8a0e643-eae2-46fb-9a9a-2b0483085460-kube-api-access-qstwq\") pod \"community-operators-srzbx\" (UID: 
\"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.912711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-utilities\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.913183 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-utilities\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.913385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-catalog-content\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:31 crc kubenswrapper[4962]: I1201 23:19:31.931831 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qstwq\" (UniqueName: \"kubernetes.io/projected/b8a0e643-eae2-46fb-9a9a-2b0483085460-kube-api-access-qstwq\") pod \"community-operators-srzbx\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:32 crc kubenswrapper[4962]: I1201 23:19:32.051411 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:32 crc kubenswrapper[4962]: I1201 23:19:32.586435 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srzbx"] Dec 01 23:19:32 crc kubenswrapper[4962]: I1201 23:19:32.673875 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srzbx" event={"ID":"b8a0e643-eae2-46fb-9a9a-2b0483085460","Type":"ContainerStarted","Data":"87204c0db3c4f97ba6cc270a2d20446e357fadeca63e80931fb035c991ee06f5"} Dec 01 23:19:33 crc kubenswrapper[4962]: I1201 23:19:33.177286 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-48pxc_5a7b0f93-3ea3-4a0f-baef-4ca08977cbde/nmstate-console-plugin/0.log" Dec 01 23:19:33 crc kubenswrapper[4962]: I1201 23:19:33.423423 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s4vbq_770c0f72-8589-4617-8b07-92d0702ff5b8/nmstate-handler/0.log" Dec 01 23:19:33 crc kubenswrapper[4962]: I1201 23:19:33.439104 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-l62dd_838c46a9-9378-4801-8cc4-e203bf8c2972/kube-rbac-proxy/0.log" Dec 01 23:19:33 crc kubenswrapper[4962]: I1201 23:19:33.502819 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-l62dd_838c46a9-9378-4801-8cc4-e203bf8c2972/nmstate-metrics/0.log" Dec 01 23:19:33 crc kubenswrapper[4962]: I1201 23:19:33.630906 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-r6ndt_c7fabe32-40b1-4300-bd18-c51c12e45a21/nmstate-operator/0.log" Dec 01 23:19:33 crc 
kubenswrapper[4962]: I1201 23:19:33.685232 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerID="65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb" exitCode=0 Dec 01 23:19:33 crc kubenswrapper[4962]: I1201 23:19:33.685343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srzbx" event={"ID":"b8a0e643-eae2-46fb-9a9a-2b0483085460","Type":"ContainerDied","Data":"65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb"} Dec 01 23:19:33 crc kubenswrapper[4962]: I1201 23:19:33.691189 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-s5czc_2473e9c3-5f3d-4122-ae3c-c0ef0de79201/nmstate-webhook/0.log" Dec 01 23:19:34 crc kubenswrapper[4962]: I1201 23:19:34.695795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srzbx" event={"ID":"b8a0e643-eae2-46fb-9a9a-2b0483085460","Type":"ContainerStarted","Data":"6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d"} Dec 01 23:19:35 crc kubenswrapper[4962]: I1201 23:19:35.709431 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerID="6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d" exitCode=0 Dec 01 23:19:35 crc kubenswrapper[4962]: I1201 23:19:35.709620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srzbx" event={"ID":"b8a0e643-eae2-46fb-9a9a-2b0483085460","Type":"ContainerDied","Data":"6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d"} Dec 01 23:19:36 crc kubenswrapper[4962]: I1201 23:19:36.725511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srzbx" event={"ID":"b8a0e643-eae2-46fb-9a9a-2b0483085460","Type":"ContainerStarted","Data":"1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa"} Dec 01 23:19:36 crc kubenswrapper[4962]: I1201 23:19:36.757015 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-srzbx" podStartSLOduration=2.963579497 podStartE2EDuration="5.756993661s" podCreationTimestamp="2025-12-01 23:19:31 +0000 UTC" firstStartedPulling="2025-12-01 23:19:33.687616605 +0000 UTC m=+6357.789055800" lastFinishedPulling="2025-12-01 23:19:36.481030769 +0000 UTC m=+6360.582469964" observedRunningTime="2025-12-01 23:19:36.753710528 +0000 UTC m=+6360.855149723" watchObservedRunningTime="2025-12-01 23:19:36.756993661 +0000 UTC m=+6360.858432866" Dec 01 23:19:42 crc kubenswrapper[4962]: I1201 23:19:42.051716 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:42 crc kubenswrapper[4962]: I1201 23:19:42.052220 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:42 crc kubenswrapper[4962]: I1201 23:19:42.132407 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:42 crc kubenswrapper[4962]: I1201 23:19:42.883403 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:42 crc kubenswrapper[4962]: I1201 23:19:42.930621 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-srzbx"] Dec 01 23:19:44 crc kubenswrapper[4962]: I1201 23:19:44.840197 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-srzbx" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="registry-server" containerID="cri-o://1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa" gracePeriod=2 Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.430051 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.537174 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-catalog-content\") pod \"b8a0e643-eae2-46fb-9a9a-2b0483085460\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.537545 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qstwq\" (UniqueName: \"kubernetes.io/projected/b8a0e643-eae2-46fb-9a9a-2b0483085460-kube-api-access-qstwq\") pod \"b8a0e643-eae2-46fb-9a9a-2b0483085460\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.538304 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-utilities\") pod \"b8a0e643-eae2-46fb-9a9a-2b0483085460\" (UID: \"b8a0e643-eae2-46fb-9a9a-2b0483085460\") " Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.539348 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-utilities" (OuterVolumeSpecName: "utilities") pod "b8a0e643-eae2-46fb-9a9a-2b0483085460" (UID: "b8a0e643-eae2-46fb-9a9a-2b0483085460"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.549670 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a0e643-eae2-46fb-9a9a-2b0483085460-kube-api-access-qstwq" (OuterVolumeSpecName: "kube-api-access-qstwq") pod "b8a0e643-eae2-46fb-9a9a-2b0483085460" (UID: "b8a0e643-eae2-46fb-9a9a-2b0483085460"). InnerVolumeSpecName "kube-api-access-qstwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.584665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8a0e643-eae2-46fb-9a9a-2b0483085460" (UID: "b8a0e643-eae2-46fb-9a9a-2b0483085460"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.640693 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.640729 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qstwq\" (UniqueName: \"kubernetes.io/projected/b8a0e643-eae2-46fb-9a9a-2b0483085460-kube-api-access-qstwq\") on node \"crc\" DevicePath \"\"" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.640743 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a0e643-eae2-46fb-9a9a-2b0483085460-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.856723 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerID="1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa" exitCode=0 Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.856773 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srzbx" event={"ID":"b8a0e643-eae2-46fb-9a9a-2b0483085460","Type":"ContainerDied","Data":"1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa"} Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.856786 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srzbx" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.856805 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srzbx" event={"ID":"b8a0e643-eae2-46fb-9a9a-2b0483085460","Type":"ContainerDied","Data":"87204c0db3c4f97ba6cc270a2d20446e357fadeca63e80931fb035c991ee06f5"} Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.856828 4962 scope.go:117] "RemoveContainer" containerID="1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.892634 4962 scope.go:117] "RemoveContainer" containerID="6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.902410 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srzbx"] Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.916280 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-srzbx"] Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.922171 4962 scope.go:117] "RemoveContainer" containerID="65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.976639 4962 scope.go:117] "RemoveContainer" containerID="1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa" Dec 01 23:19:45 crc kubenswrapper[4962]: E1201 23:19:45.977433 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa\": container with ID starting with 1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa not found: ID does not exist" containerID="1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.977485 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa"} err="failed to get container status \"1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa\": rpc error: code = NotFound desc = could not find container \"1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa\": container with ID starting with 1c7196a0b93b33c5b2ea919234062d35d505cd91324f92bab25aa613368fd2aa not found: ID does not exist" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.977514 4962 scope.go:117] "RemoveContainer" containerID="6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d" Dec 01 23:19:45 crc kubenswrapper[4962]: E1201 23:19:45.977806 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d\": container with ID starting with 6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d not found: ID does not exist" containerID="6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.977836 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d"} err="failed to get container status \"6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d\": rpc error: code = NotFound desc = could not find container \"6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d\": container with ID starting with 6240f163db57c346ccf1f220d0a973e4e92adddcd1aaa64dcc24b05ffd61b97d not found: ID does not exist" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.977860 4962 scope.go:117] "RemoveContainer" containerID="65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb" Dec 01 23:19:45 crc kubenswrapper[4962]: E1201 23:19:45.978259 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb\": container with ID starting with 65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb not found: ID does not exist" containerID="65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb" Dec 01 23:19:45 crc kubenswrapper[4962]: I1201 23:19:45.978322 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb"} err="failed to get container status \"65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb\": rpc error: code = NotFound desc = could not find container \"65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb\": container with ID starting with 65def3e956d53edea1d0e5a9e0c2394b611aa729468e8b40cdf1bef4bd9bc6cb not found: ID does not exist" Dec 01 23:19:46 crc kubenswrapper[4962]: I1201 23:19:46.239235 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" path="/var/lib/kubelet/pods/b8a0e643-eae2-46fb-9a9a-2b0483085460/volumes" Dec 01 23:19:48 crc kubenswrapper[4962]: I1201 23:19:48.078987 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/kube-rbac-proxy/0.log" Dec 01 23:19:48 crc 
kubenswrapper[4962]: I1201 23:19:48.139199 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/manager/0.log" Dec 01 23:20:03 crc kubenswrapper[4962]: I1201 23:20:03.904297 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-6l74q_6965bdb4-04f5-486b-9897-b190e56d69b0/cluster-logging-operator/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.094371 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-ppj4c_15e991cf-b72c-462a-bc84-b157fee8ac90/collector/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.185413 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_b10ba804-253d-4972-bfd5-9f5fb9847989/loki-compactor/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.276753 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-prtfr_deb58cb2-860d-49d2-95e1-12aa147bd419/loki-distributor/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.396650 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-8szr5_a89c265b-cf90-4c13-9e7e-ebd27f1b3463/gateway/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.447015 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-8szr5_a89c265b-cf90-4c13-9e7e-ebd27f1b3463/opa/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.614198 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-pnn6j_8fdb44c5-cad3-460a-a6c8-90e65be7c1ce/gateway/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.636109 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-pnn6j_8fdb44c5-cad3-460a-a6c8-90e65be7c1ce/opa/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.740345 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_92297031-5f57-47f1-a6de-4a94b6490937/loki-index-gateway/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.868181 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_30d8b489-fae1-4ed5-8a5c-19d7bad83a3d/loki-ingester/0.log" Dec 01 23:20:04 crc kubenswrapper[4962]: I1201 23:20:04.956374 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-khxd2_5e9077bf-815a-4c1f-8956-bc4094f59ceb/loki-querier/0.log" Dec 01 23:20:05 crc kubenswrapper[4962]: I1201 23:20:05.047385 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-7wvzd_dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa/loki-query-frontend/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.004322 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hf6jx_519181c6-2c70-42ee-825f-427fe5942b07/kube-rbac-proxy/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.067161 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hf6jx_519181c6-2c70-42ee-825f-427fe5942b07/controller/0.log" Dec 01 23:20:20 crc 
kubenswrapper[4962]: I1201 23:20:20.217810 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.357647 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.407619 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.407689 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.409567 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.622201 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.634512 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.663975 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.673986 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.864946 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.907184 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.926892 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:20:20 crc kubenswrapper[4962]: I1201 23:20:20.945317 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/controller/0.log" Dec 01 23:20:21 crc kubenswrapper[4962]: I1201 23:20:21.133085 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/kube-rbac-proxy/0.log" Dec 01 23:20:21 crc kubenswrapper[4962]: I1201 23:20:21.151970 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/frr-metrics/0.log" Dec 01 23:20:21 crc kubenswrapper[4962]: I1201 23:20:21.157455 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/kube-rbac-proxy-frr/0.log" Dec 01 23:20:21 crc kubenswrapper[4962]: I1201 23:20:21.318693 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/reloader/0.log" Dec 01 23:20:21 crc kubenswrapper[4962]: I1201 23:20:21.484358 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-2xwv6_ba7de090-9085-47a3-a086-73f78775d865/frr-k8s-webhook-server/0.log" Dec 01 23:20:21 crc kubenswrapper[4962]: I1201 23:20:21.589330 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fc9ff4f78-6q794_06d500dd-2267-451a-992d-d676f1033bb6/manager/0.log" Dec 01 23:20:21 crc kubenswrapper[4962]: I1201 23:20:21.728179 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74475bd8d7-k5jkf_e46e036e-ca57-4675-a356-6a0cf72b184d/webhook-server/0.log" Dec 01 23:20:22 crc kubenswrapper[4962]: I1201 23:20:22.047972 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5gxh_9dc8d3dc-4cdb-45b7-a54f-83db94bdde05/kube-rbac-proxy/0.log" Dec 01 23:20:22 crc kubenswrapper[4962]: I1201 23:20:22.536923 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5gxh_9dc8d3dc-4cdb-45b7-a54f-83db94bdde05/speaker/0.log" Dec 01 23:20:22 crc kubenswrapper[4962]: I1201 23:20:22.880723 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/frr/0.log" Dec 01 23:20:36 crc kubenswrapper[4962]: I1201 23:20:36.626529 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/util/0.log" Dec 01 23:20:36 crc kubenswrapper[4962]: I1201 23:20:36.747477 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/util/0.log" Dec 01 23:20:36 crc kubenswrapper[4962]: I1201 23:20:36.748789 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/pull/0.log" Dec 01 23:20:36 crc kubenswrapper[4962]: I1201 23:20:36.764745 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/pull/0.log" Dec 01 23:20:36 crc kubenswrapper[4962]: I1201 23:20:36.934076 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/util/0.log" Dec 01 23:20:36 crc kubenswrapper[4962]: I1201 23:20:36.941145 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/pull/0.log" Dec 01 23:20:36 crc kubenswrapper[4962]: I1201 23:20:36.960621 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/extract/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.109996 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/util/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.307825 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/pull/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.319308 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/util/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.335331 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/pull/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.500217 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/util/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.550104 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/pull/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.572701 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/extract/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.705063 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/util/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.895348 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/pull/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.934497 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/pull/0.log" Dec 01 23:20:37 crc kubenswrapper[4962]: I1201 23:20:37.945080 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/util/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.092882 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/pull/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.130883 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/util/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.136060 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/extract/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.259810 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/util/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.458914 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/util/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.464215 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/pull/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.485829 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/pull/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.635969 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/extract/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.663755 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/pull/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.664630 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/util/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.812021 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/util/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.979906 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/util/0.log" Dec 01 23:20:38 crc kubenswrapper[4962]: I1201 23:20:38.980771 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/pull/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.014588 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/pull/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.201609 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/pull/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.212181 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/util/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.227112 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/extract/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.385715 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-utilities/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.575355 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-utilities/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.611730 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-content/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.660288 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-content/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.737693 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-utilities/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.786081 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-content/0.log" Dec 01 23:20:39 crc kubenswrapper[4962]: I1201 23:20:39.910646 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-utilities/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.110817 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-utilities/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.154900 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-content/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.168843 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-content/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.382599 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-content/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.385578 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-utilities/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.721633 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lg4bg_def1945b-b735-4267-8798-cdb6e28ac006/marketplace-operator/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 
23:20:40.830214 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-utilities/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.903839 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/registry-server/0.log" Dec 01 23:20:40 crc kubenswrapper[4962]: I1201 23:20:40.975256 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-utilities/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.011856 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-content/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.061244 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-content/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.304301 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-content/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.309506 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-utilities/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.481348 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-utilities/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.660417 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/registry-server/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.725143 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-content/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.733655 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-content/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.743432 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/registry-server/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.769006 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-utilities/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.896688 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-utilities/0.log" Dec 01 23:20:41 crc kubenswrapper[4962]: I1201 23:20:41.923370 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-content/0.log" Dec 01 23:20:42 crc kubenswrapper[4962]: I1201 23:20:42.644497 4962 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/registry-server/0.log" Dec 01 23:20:55 crc kubenswrapper[4962]: I1201 23:20:55.614141 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-btqq9_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df/prometheus-operator/0.log" Dec 01 23:20:55 crc kubenswrapper[4962]: I1201 23:20:55.776003 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_b6a9273c-4395-4883-abbd-cfd15b5d552d/prometheus-operator-admission-webhook/0.log" Dec 01 23:20:55 crc kubenswrapper[4962]: I1201 23:20:55.846556 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_bf5940c1-cfd9-4ed4-93a0-db06782924ae/prometheus-operator-admission-webhook/0.log" Dec 01 23:20:55 crc kubenswrapper[4962]: I1201 23:20:55.965943 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-mbs2x_6cb92407-0085-483e-8079-3aa441bfd214/operator/0.log" Dec 01 23:20:56 crc kubenswrapper[4962]: I1201 23:20:56.099289 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-9stzc_07284111-fb8f-4fc6-9693-dfe6869248bf/observability-ui-dashboards/0.log" Dec 01 23:20:56 crc kubenswrapper[4962]: I1201 23:20:56.213733 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-ff5lc_32158a1b-c7c3-4fda-98d1-69443d10d0a5/perses-operator/0.log" Dec 01 23:21:02 crc kubenswrapper[4962]: I1201 23:21:02.784316 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:21:02 crc kubenswrapper[4962]: I1201 23:21:02.784923 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:21:10 crc kubenswrapper[4962]: I1201 23:21:10.304345 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/manager/0.log" Dec 01 23:21:10 crc kubenswrapper[4962]: I1201 23:21:10.317238 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/kube-rbac-proxy/0.log" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.457210 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tchdz"] Dec 01 23:21:15 crc kubenswrapper[4962]: E1201 23:21:15.458182 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="extract-utilities" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.458200 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="extract-utilities" Dec 
01 23:21:15 crc kubenswrapper[4962]: E1201 23:21:15.458244 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="extract-content" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.458250 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="extract-content" Dec 01 23:21:15 crc kubenswrapper[4962]: E1201 23:21:15.458262 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="registry-server" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.458268 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="registry-server" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.458518 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a0e643-eae2-46fb-9a9a-2b0483085460" containerName="registry-server" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.460266 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.476432 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tchdz"] Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.559435 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9sn\" (UniqueName: \"kubernetes.io/projected/04792a2c-328d-44c2-bde6-9822f85f23d9-kube-api-access-ch9sn\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.559688 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-catalog-content\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.559758 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-utilities\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.662232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-catalog-content\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.662299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-utilities\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.662446 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ch9sn\" (UniqueName: \"kubernetes.io/projected/04792a2c-328d-44c2-bde6-9822f85f23d9-kube-api-access-ch9sn\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.662680 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-catalog-content\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.662994 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-utilities\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.682899 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9sn\" (UniqueName: \"kubernetes.io/projected/04792a2c-328d-44c2-bde6-9822f85f23d9-kube-api-access-ch9sn\") pod \"certified-operators-tchdz\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:15 crc kubenswrapper[4962]: I1201 23:21:15.803711 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:16 crc kubenswrapper[4962]: I1201 23:21:16.416162 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tchdz"] Dec 01 23:21:16 crc kubenswrapper[4962]: I1201 23:21:16.961058 4962 generic.go:334] "Generic (PLEG): container finished" podID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerID="28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510" exitCode=0 Dec 01 23:21:16 crc kubenswrapper[4962]: I1201 23:21:16.961252 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tchdz" event={"ID":"04792a2c-328d-44c2-bde6-9822f85f23d9","Type":"ContainerDied","Data":"28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510"} Dec 01 23:21:16 crc kubenswrapper[4962]: I1201 23:21:16.961337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tchdz" event={"ID":"04792a2c-328d-44c2-bde6-9822f85f23d9","Type":"ContainerStarted","Data":"896d4d5802677a7df553202cb6ff32f7863fd2804935f886180c6573a373f607"} Dec 01 23:21:16 crc kubenswrapper[4962]: I1201 23:21:16.964387 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 23:21:17 crc kubenswrapper[4962]: I1201 23:21:17.973327 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tchdz" event={"ID":"04792a2c-328d-44c2-bde6-9822f85f23d9","Type":"ContainerStarted","Data":"a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc"} Dec 01 23:21:19 crc kubenswrapper[4962]: I1201 23:21:19.993304 4962 generic.go:334] "Generic (PLEG): container finished" podID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerID="a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc" exitCode=0 Dec 01 23:21:19 crc kubenswrapper[4962]: I1201 23:21:19.993789 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-tchdz" event={"ID":"04792a2c-328d-44c2-bde6-9822f85f23d9","Type":"ContainerDied","Data":"a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc"} Dec 01 23:21:21 crc kubenswrapper[4962]: I1201 23:21:21.030914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tchdz" event={"ID":"04792a2c-328d-44c2-bde6-9822f85f23d9","Type":"ContainerStarted","Data":"52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc"} Dec 01 23:21:21 crc kubenswrapper[4962]: I1201 23:21:21.068265 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tchdz" podStartSLOduration=2.485391526 podStartE2EDuration="6.068240267s" podCreationTimestamp="2025-12-01 23:21:15 +0000 UTC" firstStartedPulling="2025-12-01 23:21:16.963671517 +0000 UTC m=+6461.065110712" lastFinishedPulling="2025-12-01 23:21:20.546520258 +0000 UTC m=+6464.647959453" observedRunningTime="2025-12-01 23:21:21.054187089 +0000 UTC m=+6465.155626284" watchObservedRunningTime="2025-12-01 23:21:21.068240267 +0000 UTC m=+6465.169679462" Dec 01 23:21:24 crc kubenswrapper[4962]: E1201 23:21:24.394696 4962 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.110:42194->38.102.83.110:46143: read tcp 38.102.83.110:42194->38.102.83.110:46143: read: connection reset by peer Dec 01 23:21:25 crc kubenswrapper[4962]: I1201 23:21:25.812323 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:25 crc kubenswrapper[4962]: I1201 23:21:25.812637 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:25 crc kubenswrapper[4962]: I1201 23:21:25.888223 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:26 crc kubenswrapper[4962]: I1201 23:21:26.153451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:26 crc kubenswrapper[4962]: I1201 23:21:26.215696 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tchdz"] Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.114759 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tchdz" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="registry-server" containerID="cri-o://52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc" gracePeriod=2 Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.777454 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.901672 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-utilities\") pod \"04792a2c-328d-44c2-bde6-9822f85f23d9\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.902047 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-catalog-content\") pod \"04792a2c-328d-44c2-bde6-9822f85f23d9\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.902073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch9sn\" (UniqueName: \"kubernetes.io/projected/04792a2c-328d-44c2-bde6-9822f85f23d9-kube-api-access-ch9sn\") pod \"04792a2c-328d-44c2-bde6-9822f85f23d9\" (UID: \"04792a2c-328d-44c2-bde6-9822f85f23d9\") " Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.902095 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-utilities" (OuterVolumeSpecName: "utilities") pod "04792a2c-328d-44c2-bde6-9822f85f23d9" (UID: "04792a2c-328d-44c2-bde6-9822f85f23d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.902719 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.929120 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04792a2c-328d-44c2-bde6-9822f85f23d9-kube-api-access-ch9sn" (OuterVolumeSpecName: "kube-api-access-ch9sn") pod "04792a2c-328d-44c2-bde6-9822f85f23d9" (UID: "04792a2c-328d-44c2-bde6-9822f85f23d9"). InnerVolumeSpecName "kube-api-access-ch9sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:21:28 crc kubenswrapper[4962]: I1201 23:21:28.948917 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04792a2c-328d-44c2-bde6-9822f85f23d9" (UID: "04792a2c-328d-44c2-bde6-9822f85f23d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.022178 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04792a2c-328d-44c2-bde6-9822f85f23d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.022224 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch9sn\" (UniqueName: \"kubernetes.io/projected/04792a2c-328d-44c2-bde6-9822f85f23d9-kube-api-access-ch9sn\") on node \"crc\" DevicePath \"\"" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.137439 4962 generic.go:334] "Generic (PLEG): container finished" podID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerID="52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc" exitCode=0 Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.137497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tchdz" event={"ID":"04792a2c-328d-44c2-bde6-9822f85f23d9","Type":"ContainerDied","Data":"52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc"} Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.137531 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tchdz" event={"ID":"04792a2c-328d-44c2-bde6-9822f85f23d9","Type":"ContainerDied","Data":"896d4d5802677a7df553202cb6ff32f7863fd2804935f886180c6573a373f607"} Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.137551 4962 scope.go:117] "RemoveContainer" containerID="52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.137737 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tchdz" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.182015 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tchdz"] Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.192168 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tchdz"] Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.214361 4962 scope.go:117] "RemoveContainer" containerID="a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.247087 4962 scope.go:117] "RemoveContainer" containerID="28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.292380 4962 scope.go:117] "RemoveContainer" containerID="52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc" Dec 01 23:21:29 crc kubenswrapper[4962]: E1201 23:21:29.292862 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc\": container with ID starting with 52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc not found: ID does not exist" containerID="52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.292895 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc"} err="failed to get container status \"52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc\": rpc error: code = NotFound desc = could not find container \"52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc\": container with ID starting with 52c8dd09f28e54c148b6dcd125f9a34fddfbb9c4d926ee1be35994e811b074bc not found: ID does not exist" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.292916 4962 scope.go:117] "RemoveContainer" containerID="a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc" Dec 01 23:21:29 crc kubenswrapper[4962]: E1201 23:21:29.293465 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc\": container with ID starting with a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc not found: ID does not exist" containerID="a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.293486 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc"} err="failed to get container status \"a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc\": rpc error: code = NotFound desc = could not find container \"a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc\": container with ID starting with a2127f4efa39215667d44fa7dab3d23fe5320679ee3df34e66d63c1b542bc8dc not found: ID does not exist" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.293498 4962 scope.go:117] "RemoveContainer" containerID="28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510" Dec 01 23:21:29 crc kubenswrapper[4962]: E1201 23:21:29.295187 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510\": container with ID starting with 28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510 not found: ID does not exist" containerID="28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510" Dec 01 23:21:29 crc kubenswrapper[4962]: I1201 23:21:29.295215 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510"} err="failed to get container status \"28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510\": rpc error: code = NotFound desc = could not find container \"28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510\": container with ID starting with 28ce81100cd60ec466ddbd6eeb3adeeaaece3c2bed7c5fbde3307e32a53e7510 not found: ID does not exist" Dec 01 23:21:30 crc kubenswrapper[4962]: I1201 23:21:30.233350 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" path="/var/lib/kubelet/pods/04792a2c-328d-44c2-bde6-9822f85f23d9/volumes" Dec 01 23:21:32 crc kubenswrapper[4962]: I1201 23:21:32.785076 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:21:32 crc kubenswrapper[4962]: I1201 23:21:32.785600 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:22:02 crc kubenswrapper[4962]: I1201 23:22:02.785119 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:22:02 crc kubenswrapper[4962]: I1201 23:22:02.785720 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:22:02 crc kubenswrapper[4962]: I1201 23:22:02.785769 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 23:22:02 crc kubenswrapper[4962]: I1201 23:22:02.787102 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"240fdf592428a97c287ff53cfcb71065a9dae3fb374060e47fe22b9eeb666e0a"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 23:22:02 crc kubenswrapper[4962]: I1201 23:22:02.787175 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" 
podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://240fdf592428a97c287ff53cfcb71065a9dae3fb374060e47fe22b9eeb666e0a" gracePeriod=600 Dec 01 23:22:03 crc kubenswrapper[4962]: I1201 23:22:03.613183 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="240fdf592428a97c287ff53cfcb71065a9dae3fb374060e47fe22b9eeb666e0a" exitCode=0 Dec 01 23:22:03 crc kubenswrapper[4962]: I1201 23:22:03.613850 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"240fdf592428a97c287ff53cfcb71065a9dae3fb374060e47fe22b9eeb666e0a"} Dec 01 23:22:03 crc kubenswrapper[4962]: I1201 23:22:03.613889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8"} Dec 01 23:22:03 crc kubenswrapper[4962]: I1201 23:22:03.614032 4962 scope.go:117] "RemoveContainer" containerID="9069c6b0afe806496066b75f21b8e7f8eb26d789d60c34eb7173619cac2f7eff" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.449560 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v24f8"] Dec 01 23:22:44 crc kubenswrapper[4962]: E1201 23:22:44.450407 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="registry-server" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.450421 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="registry-server" Dec 01 23:22:44 crc kubenswrapper[4962]: E1201 23:22:44.450453 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="extract-content" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.450459 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="extract-content" Dec 01 23:22:44 crc kubenswrapper[4962]: E1201 23:22:44.450475 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="extract-utilities" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.450481 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="extract-utilities" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.450695 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="04792a2c-328d-44c2-bde6-9822f85f23d9" containerName="registry-server" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.452408 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.463765 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v24f8"] Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.544444 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-catalog-content\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.544514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9f4\" (UniqueName: \"kubernetes.io/projected/83d5def9-05c1-484f-ac66-4ceb92efce0e-kube-api-access-ll9f4\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.544575 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-utilities\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.646038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-catalog-content\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.646349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9f4\" (UniqueName: \"kubernetes.io/projected/83d5def9-05c1-484f-ac66-4ceb92efce0e-kube-api-access-ll9f4\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.646478 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-utilities\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.646501 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-catalog-content\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.647023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-utilities\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.675925 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ll9f4\" (UniqueName: \"kubernetes.io/projected/83d5def9-05c1-484f-ac66-4ceb92efce0e-kube-api-access-ll9f4\") pod \"redhat-marketplace-v24f8\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:44 crc kubenswrapper[4962]: I1201 23:22:44.769062 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:45 crc kubenswrapper[4962]: I1201 23:22:45.274252 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v24f8"] Dec 01 23:22:45 crc kubenswrapper[4962]: W1201 23:22:45.307944 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83d5def9_05c1_484f_ac66_4ceb92efce0e.slice/crio-35c874ddecd3441645dcff1ebd1789f3af8cfa1a74564fd8a885e6c1a79f2abd WatchSource:0}: Error finding container 35c874ddecd3441645dcff1ebd1789f3af8cfa1a74564fd8a885e6c1a79f2abd: Status 404 returned error can't find the container with id 35c874ddecd3441645dcff1ebd1789f3af8cfa1a74564fd8a885e6c1a79f2abd Dec 01 23:22:46 crc kubenswrapper[4962]: I1201 23:22:46.224237 4962 generic.go:334] "Generic (PLEG): container finished" podID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerID="b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14" exitCode=0 Dec 01 23:22:46 crc kubenswrapper[4962]: I1201 23:22:46.252487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v24f8" event={"ID":"83d5def9-05c1-484f-ac66-4ceb92efce0e","Type":"ContainerDied","Data":"b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14"} Dec 01 23:22:46 crc kubenswrapper[4962]: I1201 23:22:46.252882 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v24f8" event={"ID":"83d5def9-05c1-484f-ac66-4ceb92efce0e","Type":"ContainerStarted","Data":"35c874ddecd3441645dcff1ebd1789f3af8cfa1a74564fd8a885e6c1a79f2abd"} Dec 01 23:22:48 crc kubenswrapper[4962]: I1201 23:22:48.262717 4962 generic.go:334] "Generic (PLEG): container finished" podID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerID="f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129" exitCode=0 Dec 01 23:22:48 crc kubenswrapper[4962]: I1201 23:22:48.263235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v24f8" event={"ID":"83d5def9-05c1-484f-ac66-4ceb92efce0e","Type":"ContainerDied","Data":"f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129"} Dec 01 23:22:50 crc kubenswrapper[4962]: I1201 23:22:50.291191 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v24f8" event={"ID":"83d5def9-05c1-484f-ac66-4ceb92efce0e","Type":"ContainerStarted","Data":"c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1"} Dec 01 23:22:50 crc kubenswrapper[4962]: I1201 23:22:50.318566 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v24f8" podStartSLOduration=2.841730704 podStartE2EDuration="6.31854717s" podCreationTimestamp="2025-12-01 23:22:44 +0000 UTC" firstStartedPulling="2025-12-01 23:22:46.254991841 +0000 UTC m=+6550.356431036" lastFinishedPulling="2025-12-01 23:22:49.731808297 +0000 UTC m=+6553.833247502" observedRunningTime="2025-12-01 23:22:50.318516059 +0000 UTC m=+6554.419955254" 
watchObservedRunningTime="2025-12-01 23:22:50.31854717 +0000 UTC m=+6554.419986365" Dec 01 23:22:54 crc kubenswrapper[4962]: I1201 23:22:54.770314 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:54 crc kubenswrapper[4962]: I1201 23:22:54.772833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:54 crc kubenswrapper[4962]: I1201 23:22:54.868463 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:55 crc kubenswrapper[4962]: I1201 23:22:55.434209 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:55 crc kubenswrapper[4962]: I1201 23:22:55.547109 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v24f8"] Dec 01 23:22:57 crc kubenswrapper[4962]: I1201 23:22:57.376197 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v24f8" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="registry-server" containerID="cri-o://c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1" gracePeriod=2 Dec 01 23:22:57 crc kubenswrapper[4962]: I1201 23:22:57.947672 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.113169 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll9f4\" (UniqueName: \"kubernetes.io/projected/83d5def9-05c1-484f-ac66-4ceb92efce0e-kube-api-access-ll9f4\") pod \"83d5def9-05c1-484f-ac66-4ceb92efce0e\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.113539 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-catalog-content\") pod \"83d5def9-05c1-484f-ac66-4ceb92efce0e\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.113731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-utilities\") pod \"83d5def9-05c1-484f-ac66-4ceb92efce0e\" (UID: \"83d5def9-05c1-484f-ac66-4ceb92efce0e\") " Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.114529 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-utilities" (OuterVolumeSpecName: "utilities") pod "83d5def9-05c1-484f-ac66-4ceb92efce0e" (UID: "83d5def9-05c1-484f-ac66-4ceb92efce0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.122189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d5def9-05c1-484f-ac66-4ceb92efce0e-kube-api-access-ll9f4" (OuterVolumeSpecName: "kube-api-access-ll9f4") pod "83d5def9-05c1-484f-ac66-4ceb92efce0e" (UID: "83d5def9-05c1-484f-ac66-4ceb92efce0e"). InnerVolumeSpecName "kube-api-access-ll9f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.136100 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83d5def9-05c1-484f-ac66-4ceb92efce0e" (UID: "83d5def9-05c1-484f-ac66-4ceb92efce0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.216705 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.216761 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll9f4\" (UniqueName: \"kubernetes.io/projected/83d5def9-05c1-484f-ac66-4ceb92efce0e-kube-api-access-ll9f4\") on node \"crc\" DevicePath \"\"" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.216783 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83d5def9-05c1-484f-ac66-4ceb92efce0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.392516 4962 generic.go:334] "Generic (PLEG): container finished" podID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerID="c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1" exitCode=0 Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.392564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v24f8" event={"ID":"83d5def9-05c1-484f-ac66-4ceb92efce0e","Type":"ContainerDied","Data":"c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1"} Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.392594 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v24f8" event={"ID":"83d5def9-05c1-484f-ac66-4ceb92efce0e","Type":"ContainerDied","Data":"35c874ddecd3441645dcff1ebd1789f3af8cfa1a74564fd8a885e6c1a79f2abd"} Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.392615 4962 scope.go:117] "RemoveContainer" containerID="c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.392758 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v24f8" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.425583 4962 scope.go:117] "RemoveContainer" containerID="f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.445013 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v24f8"] Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.471844 4962 scope.go:117] "RemoveContainer" containerID="b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.475514 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v24f8"] Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.519139 4962 scope.go:117] "RemoveContainer" containerID="c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1" Dec 01 23:22:58 crc kubenswrapper[4962]: E1201 23:22:58.519564 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1\": container with ID starting with c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1 not found: ID does not exist" containerID="c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.519591 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1"} err="failed to get container status \"c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1\": rpc error: code = NotFound desc = could not find container \"c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1\": container with ID starting with c5809d61f0cde43d4330073e85363e8449c297759a01b57b4b97c21a1e22bca1 not found: ID does not exist" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.519615 4962 scope.go:117] "RemoveContainer" containerID="f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129" Dec 01 23:22:58 crc kubenswrapper[4962]: E1201 23:22:58.519904 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129\": container with ID starting with f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129 not found: ID does not exist" containerID="f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.519958 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129"} err="failed to get container status \"f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129\": rpc error: code = NotFound desc = could not find container \"f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129\": container with ID starting with f9780db3d1242de7263e130fc5e96a4ac91e88de24c3e7275af8ab601c412129 not found: ID does not exist" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.519976 4962 scope.go:117] "RemoveContainer" containerID="b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14" Dec 01 23:22:58 crc kubenswrapper[4962]: E1201 23:22:58.520289 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14\": container with ID starting with b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14 not found: ID does not exist" containerID="b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14" Dec 01 23:22:58 crc kubenswrapper[4962]: I1201 23:22:58.520309 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14"} err="failed to get container status \"b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14\": rpc error: code = NotFound desc = could not find container \"b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14\": container with ID starting with b4d5fbc8d72458de67364eb964a6a8a74490601e032bb3a5f98606c2229abf14 not found: ID does not exist" Dec 01 23:22:59 crc kubenswrapper[4962]: I1201 23:22:59.409295 4962 generic.go:334] "Generic (PLEG): container finished" podID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerID="361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c" exitCode=0 Dec 01 23:22:59 crc kubenswrapper[4962]: I1201 23:22:59.409390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ps2df/must-gather-5pcl2" event={"ID":"f31dec33-accc-4d65-88f7-1c3e6d179671","Type":"ContainerDied","Data":"361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c"} Dec 01 23:22:59 crc kubenswrapper[4962]: I1201 23:22:59.410305 4962 scope.go:117] "RemoveContainer" containerID="361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c" Dec 01 23:23:00 crc kubenswrapper[4962]: I1201 23:23:00.240245 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" path="/var/lib/kubelet/pods/83d5def9-05c1-484f-ac66-4ceb92efce0e/volumes" Dec 01 23:23:00 crc kubenswrapper[4962]: I1201 23:23:00.265484 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ps2df_must-gather-5pcl2_f31dec33-accc-4d65-88f7-1c3e6d179671/gather/0.log" Dec 01 23:23:07 crc kubenswrapper[4962]: I1201 23:23:07.955913 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ps2df/must-gather-5pcl2"] Dec 01 23:23:07 crc kubenswrapper[4962]: I1201 23:23:07.957024 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ps2df/must-gather-5pcl2" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerName="copy" containerID="cri-o://5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2" gracePeriod=2 Dec 01 23:23:07 crc kubenswrapper[4962]: I1201 23:23:07.969668 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ps2df/must-gather-5pcl2"] Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.436306 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ps2df_must-gather-5pcl2_f31dec33-accc-4d65-88f7-1c3e6d179671/copy/0.log" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.437322 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.519453 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ps2df_must-gather-5pcl2_f31dec33-accc-4d65-88f7-1c3e6d179671/copy/0.log" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.520082 4962 generic.go:334] "Generic (PLEG): container finished" podID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerID="5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2" exitCode=143 Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.520129 4962 scope.go:117] "RemoveContainer" containerID="5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.520254 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ps2df/must-gather-5pcl2" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.541464 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f31dec33-accc-4d65-88f7-1c3e6d179671-must-gather-output\") pod \"f31dec33-accc-4d65-88f7-1c3e6d179671\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.541719 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlspn\" (UniqueName: \"kubernetes.io/projected/f31dec33-accc-4d65-88f7-1c3e6d179671-kube-api-access-jlspn\") pod \"f31dec33-accc-4d65-88f7-1c3e6d179671\" (UID: \"f31dec33-accc-4d65-88f7-1c3e6d179671\") " Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.549261 4962 scope.go:117] "RemoveContainer" containerID="361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.553587 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31dec33-accc-4d65-88f7-1c3e6d179671-kube-api-access-jlspn" (OuterVolumeSpecName: "kube-api-access-jlspn") pod "f31dec33-accc-4d65-88f7-1c3e6d179671" (UID: "f31dec33-accc-4d65-88f7-1c3e6d179671"). InnerVolumeSpecName "kube-api-access-jlspn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.644874 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlspn\" (UniqueName: \"kubernetes.io/projected/f31dec33-accc-4d65-88f7-1c3e6d179671-kube-api-access-jlspn\") on node \"crc\" DevicePath \"\"" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.657610 4962 scope.go:117] "RemoveContainer" containerID="5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2" Dec 01 23:23:08 crc kubenswrapper[4962]: E1201 23:23:08.658009 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2\": container with ID starting with 5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2 not found: ID does not exist" containerID="5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.658046 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2"} err="failed to get container status \"5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2\": rpc error: code = NotFound desc = could not find container \"5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2\": container with ID starting with 5474913e902ab2a0db121683e228b7e6a8df406cc278e2aea6672b812dc132e2 not found: ID does not exist" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.658088 4962 scope.go:117] "RemoveContainer" containerID="361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c" Dec 01 23:23:08 crc kubenswrapper[4962]: E1201 23:23:08.658337 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c\": container with ID starting with 361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c not found: ID does not exist" containerID="361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.658367 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c"} err="failed to get container status \"361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c\": rpc error: code = NotFound desc = could not find container \"361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c\": container with ID starting with 361def20a233997b5d820ed2051d0c269c57be73a8e197f8b0322c30120a870c not found: ID does not exist" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.765661 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31dec33-accc-4d65-88f7-1c3e6d179671-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f31dec33-accc-4d65-88f7-1c3e6d179671" (UID: "f31dec33-accc-4d65-88f7-1c3e6d179671"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:23:08 crc kubenswrapper[4962]: I1201 23:23:08.850178 4962 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f31dec33-accc-4d65-88f7-1c3e6d179671-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 23:23:09 crc kubenswrapper[4962]: I1201 23:23:09.833851 4962 scope.go:117] "RemoveContainer" containerID="f4d6205daa2c6812d0d5f064d1d86997b28ac62d45ade3e0ba3b8e20b6f9276b" Dec 01 23:23:10 crc kubenswrapper[4962]: I1201 23:23:10.235624 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" path="/var/lib/kubelet/pods/f31dec33-accc-4d65-88f7-1c3e6d179671/volumes" Dec 01 23:24:09 crc kubenswrapper[4962]: I1201 23:24:09.978179 4962 scope.go:117] "RemoveContainer" containerID="7612b67d6c56216b58753cfacc25857ec263f6e885b006fc9a9f49732c6b1d60" Dec 01 23:24:32 crc kubenswrapper[4962]: I1201 23:24:32.785035 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:24:32 crc kubenswrapper[4962]: I1201 23:24:32.785920 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:25:02 crc kubenswrapper[4962]: I1201 23:25:02.784157 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:25:02 crc kubenswrapper[4962]: I1201 23:25:02.784874 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:25:32 crc kubenswrapper[4962]: I1201 23:25:32.784419 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:25:32 crc kubenswrapper[4962]: I1201 23:25:32.785012 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:25:32 crc kubenswrapper[4962]: I1201 23:25:32.785060 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 23:25:32 crc kubenswrapper[4962]: I1201 23:25:32.786315 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 23:25:32 crc kubenswrapper[4962]: I1201 23:25:32.786417 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" gracePeriod=600 Dec 01 23:25:32 crc kubenswrapper[4962]: E1201 23:25:32.937846 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:25:33 crc kubenswrapper[4962]: I1201 23:25:33.571704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8"} Dec 01 23:25:33 crc kubenswrapper[4962]: I1201 23:25:33.571801 4962 scope.go:117] "RemoveContainer" containerID="240fdf592428a97c287ff53cfcb71065a9dae3fb374060e47fe22b9eeb666e0a" Dec 01 23:25:33 crc kubenswrapper[4962]: I1201 23:25:33.571645 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" exitCode=0 Dec 01 23:25:33 crc kubenswrapper[4962]: I1201 23:25:33.572878 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:25:33 crc kubenswrapper[4962]: E1201 23:25:33.573551 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:25:49 crc kubenswrapper[4962]: I1201 23:25:49.221341 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:25:49 crc kubenswrapper[4962]: E1201 23:25:49.222246 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:26:03 crc kubenswrapper[4962]: I1201 23:26:03.220022 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:26:03 crc kubenswrapper[4962]: E1201 23:26:03.220765 4962 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:26:17 crc kubenswrapper[4962]: I1201 23:26:17.220112 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:26:17 crc kubenswrapper[4962]: E1201 23:26:17.221012 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:26:32 crc kubenswrapper[4962]: I1201 23:26:32.220757 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:26:32 crc kubenswrapper[4962]: E1201 23:26:32.221862 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.915397 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mx56v/must-gather-2mg5k"] Dec 01 23:26:33 crc kubenswrapper[4962]: E1201 23:26:33.916099 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerName="gather" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916112 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerName="gather" Dec 01 23:26:33 crc kubenswrapper[4962]: E1201 23:26:33.916135 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerName="copy" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916143 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerName="copy" Dec 01 23:26:33 crc kubenswrapper[4962]: E1201 23:26:33.916152 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="extract-content" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916158 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="extract-content" Dec 01 23:26:33 crc kubenswrapper[4962]: E1201 23:26:33.916170 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="extract-utilities" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916178 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="extract-utilities" Dec 01 23:26:33 crc kubenswrapper[4962]: E1201 23:26:33.916192 4962 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="registry-server" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916198 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="registry-server" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916447 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerName="gather" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916466 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31dec33-accc-4d65-88f7-1c3e6d179671" containerName="copy" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.916481 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d5def9-05c1-484f-ac66-4ceb92efce0e" containerName="registry-server" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.917667 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.934604 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mx56v"/"openshift-service-ca.crt" Dec 01 23:26:33 crc kubenswrapper[4962]: I1201 23:26:33.934610 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mx56v"/"kube-root-ca.crt" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.017142 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mx56v/must-gather-2mg5k"] Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.094713 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9tph\" (UniqueName: \"kubernetes.io/projected/984aec14-799d-464c-a22a-d6511adacd1b-kube-api-access-p9tph\") pod \"must-gather-2mg5k\" (UID: \"984aec14-799d-464c-a22a-d6511adacd1b\") " pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.094915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/984aec14-799d-464c-a22a-d6511adacd1b-must-gather-output\") pod \"must-gather-2mg5k\" (UID: \"984aec14-799d-464c-a22a-d6511adacd1b\") " pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.197296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9tph\" (UniqueName: \"kubernetes.io/projected/984aec14-799d-464c-a22a-d6511adacd1b-kube-api-access-p9tph\") pod \"must-gather-2mg5k\" (UID: \"984aec14-799d-464c-a22a-d6511adacd1b\") " pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.197530 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/984aec14-799d-464c-a22a-d6511adacd1b-must-gather-output\") pod \"must-gather-2mg5k\" (UID: \"984aec14-799d-464c-a22a-d6511adacd1b\") " pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.198070 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/984aec14-799d-464c-a22a-d6511adacd1b-must-gather-output\") pod \"must-gather-2mg5k\" (UID: 
\"984aec14-799d-464c-a22a-d6511adacd1b\") " pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.219705 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9tph\" (UniqueName: \"kubernetes.io/projected/984aec14-799d-464c-a22a-d6511adacd1b-kube-api-access-p9tph\") pod \"must-gather-2mg5k\" (UID: \"984aec14-799d-464c-a22a-d6511adacd1b\") " pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.246749 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:26:34 crc kubenswrapper[4962]: I1201 23:26:34.734861 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mx56v/must-gather-2mg5k"] Dec 01 23:26:35 crc kubenswrapper[4962]: I1201 23:26:35.487903 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/must-gather-2mg5k" event={"ID":"984aec14-799d-464c-a22a-d6511adacd1b","Type":"ContainerStarted","Data":"815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2"} Dec 01 23:26:35 crc kubenswrapper[4962]: I1201 23:26:35.488429 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/must-gather-2mg5k" event={"ID":"984aec14-799d-464c-a22a-d6511adacd1b","Type":"ContainerStarted","Data":"dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595"} Dec 01 23:26:35 crc kubenswrapper[4962]: I1201 23:26:35.488458 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/must-gather-2mg5k" event={"ID":"984aec14-799d-464c-a22a-d6511adacd1b","Type":"ContainerStarted","Data":"7fb7cdaeff9a816f78bd3a878a9ee944d0150be5c26e70f73862c37da4273a1f"} Dec 01 23:26:35 crc kubenswrapper[4962]: I1201 23:26:35.512555 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mx56v/must-gather-2mg5k" podStartSLOduration=2.51252754 podStartE2EDuration="2.51252754s" podCreationTimestamp="2025-12-01 23:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 23:26:35.50725136 +0000 UTC m=+6779.608690595" watchObservedRunningTime="2025-12-01 23:26:35.51252754 +0000 UTC m=+6779.613966745" Dec 01 23:26:38 crc kubenswrapper[4962]: I1201 23:26:38.955056 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mx56v/crc-debug-fq7ls"] Dec 01 23:26:38 crc kubenswrapper[4962]: I1201 23:26:38.957068 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:38 crc kubenswrapper[4962]: I1201 23:26:38.959979 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mx56v"/"default-dockercfg-vcz52" Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.114738 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a507cdf-bdae-4cb9-a5ec-213ad340619a-host\") pod \"crc-debug-fq7ls\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.114825 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mhrx\" (UniqueName: \"kubernetes.io/projected/5a507cdf-bdae-4cb9-a5ec-213ad340619a-kube-api-access-7mhrx\") pod \"crc-debug-fq7ls\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.217113 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mhrx\" (UniqueName: \"kubernetes.io/projected/5a507cdf-bdae-4cb9-a5ec-213ad340619a-kube-api-access-7mhrx\") pod \"crc-debug-fq7ls\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.217304 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a507cdf-bdae-4cb9-a5ec-213ad340619a-host\") pod \"crc-debug-fq7ls\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.217807 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a507cdf-bdae-4cb9-a5ec-213ad340619a-host\") pod \"crc-debug-fq7ls\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.235747 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mhrx\" (UniqueName: \"kubernetes.io/projected/5a507cdf-bdae-4cb9-a5ec-213ad340619a-kube-api-access-7mhrx\") pod \"crc-debug-fq7ls\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.276403 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:26:39 crc kubenswrapper[4962]: W1201 23:26:39.318488 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a507cdf_bdae_4cb9_a5ec_213ad340619a.slice/crio-ddb1e0c74979203569c217edd4d6f72bb6b703967d9ab975b4876a4ba4138f2f WatchSource:0}: Error finding container ddb1e0c74979203569c217edd4d6f72bb6b703967d9ab975b4876a4ba4138f2f: Status 404 returned error can't find the container with id ddb1e0c74979203569c217edd4d6f72bb6b703967d9ab975b4876a4ba4138f2f Dec 01 23:26:39 crc kubenswrapper[4962]: I1201 23:26:39.532859 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" event={"ID":"5a507cdf-bdae-4cb9-a5ec-213ad340619a","Type":"ContainerStarted","Data":"ddb1e0c74979203569c217edd4d6f72bb6b703967d9ab975b4876a4ba4138f2f"} Dec 01 23:26:40 crc kubenswrapper[4962]: I1201 23:26:40.546226 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" event={"ID":"5a507cdf-bdae-4cb9-a5ec-213ad340619a","Type":"ContainerStarted","Data":"f5c6acc25bde2708179e70cca992411f3e1fcc4dc13e5aab39d191b31baea2fc"} Dec 01 23:26:40 crc kubenswrapper[4962]: I1201 23:26:40.575220 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" podStartSLOduration=2.57520139 podStartE2EDuration="2.57520139s" podCreationTimestamp="2025-12-01 23:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 23:26:40.560973557 +0000 UTC m=+6784.662412752" watchObservedRunningTime="2025-12-01 23:26:40.57520139 +0000 UTC m=+6784.676640585" Dec 01 23:26:45 crc kubenswrapper[4962]: I1201 23:26:45.219740 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:26:45 crc kubenswrapper[4962]: E1201 23:26:45.220613 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:27:00 crc kubenswrapper[4962]: I1201 23:27:00.219736 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:27:00 crc kubenswrapper[4962]: E1201 23:27:00.221861 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:27:13 crc kubenswrapper[4962]: I1201 23:27:13.219753 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:27:13 crc kubenswrapper[4962]: E1201 23:27:13.220614 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:27:22 crc kubenswrapper[4962]: I1201 23:27:22.017769 4962 generic.go:334] "Generic (PLEG): container finished" podID="5a507cdf-bdae-4cb9-a5ec-213ad340619a" containerID="f5c6acc25bde2708179e70cca992411f3e1fcc4dc13e5aab39d191b31baea2fc" exitCode=0 Dec 01 23:27:22 crc kubenswrapper[4962]: I1201 23:27:22.017899 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" event={"ID":"5a507cdf-bdae-4cb9-a5ec-213ad340619a","Type":"ContainerDied","Data":"f5c6acc25bde2708179e70cca992411f3e1fcc4dc13e5aab39d191b31baea2fc"} Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.186704 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.231528 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mx56v/crc-debug-fq7ls"] Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.247403 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mx56v/crc-debug-fq7ls"] Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.349856 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a507cdf-bdae-4cb9-a5ec-213ad340619a-host\") pod \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.349992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mhrx\" (UniqueName: \"kubernetes.io/projected/5a507cdf-bdae-4cb9-a5ec-213ad340619a-kube-api-access-7mhrx\") pod \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\" (UID: \"5a507cdf-bdae-4cb9-a5ec-213ad340619a\") " Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.350022 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a507cdf-bdae-4cb9-a5ec-213ad340619a-host" (OuterVolumeSpecName: "host") pod "5a507cdf-bdae-4cb9-a5ec-213ad340619a" (UID: "5a507cdf-bdae-4cb9-a5ec-213ad340619a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.351235 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a507cdf-bdae-4cb9-a5ec-213ad340619a-host\") on node \"crc\" DevicePath \"\"" Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.357272 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a507cdf-bdae-4cb9-a5ec-213ad340619a-kube-api-access-7mhrx" (OuterVolumeSpecName: "kube-api-access-7mhrx") pod "5a507cdf-bdae-4cb9-a5ec-213ad340619a" (UID: "5a507cdf-bdae-4cb9-a5ec-213ad340619a"). InnerVolumeSpecName "kube-api-access-7mhrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:27:23 crc kubenswrapper[4962]: I1201 23:27:23.453289 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mhrx\" (UniqueName: \"kubernetes.io/projected/5a507cdf-bdae-4cb9-a5ec-213ad340619a-kube-api-access-7mhrx\") on node \"crc\" DevicePath \"\"" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.039761 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddb1e0c74979203569c217edd4d6f72bb6b703967d9ab975b4876a4ba4138f2f" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.039829 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-fq7ls" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.236431 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a507cdf-bdae-4cb9-a5ec-213ad340619a" path="/var/lib/kubelet/pods/5a507cdf-bdae-4cb9-a5ec-213ad340619a/volumes" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.414444 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mx56v/crc-debug-s7cjj"] Dec 01 23:27:24 crc kubenswrapper[4962]: E1201 23:27:24.415158 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a507cdf-bdae-4cb9-a5ec-213ad340619a" containerName="container-00" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.415175 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a507cdf-bdae-4cb9-a5ec-213ad340619a" containerName="container-00" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.415397 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a507cdf-bdae-4cb9-a5ec-213ad340619a" containerName="container-00" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.416175 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.418890 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mx56v"/"default-dockercfg-vcz52" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.479398 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cj9x\" (UniqueName: \"kubernetes.io/projected/bd757916-f952-4d9c-8336-0ff41d9c17b9-kube-api-access-8cj9x\") pod \"crc-debug-s7cjj\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.479767 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd757916-f952-4d9c-8336-0ff41d9c17b9-host\") pod \"crc-debug-s7cjj\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.582170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cj9x\" (UniqueName: \"kubernetes.io/projected/bd757916-f952-4d9c-8336-0ff41d9c17b9-kube-api-access-8cj9x\") pod \"crc-debug-s7cjj\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.582371 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd757916-f952-4d9c-8336-0ff41d9c17b9-host\") pod \"crc-debug-s7cjj\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.582528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd757916-f952-4d9c-8336-0ff41d9c17b9-host\") pod \"crc-debug-s7cjj\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.604977 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cj9x\" (UniqueName: \"kubernetes.io/projected/bd757916-f952-4d9c-8336-0ff41d9c17b9-kube-api-access-8cj9x\") pod \"crc-debug-s7cjj\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:24 crc kubenswrapper[4962]: I1201 23:27:24.738349 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:25 crc kubenswrapper[4962]: I1201 23:27:25.052513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-s7cjj" event={"ID":"bd757916-f952-4d9c-8336-0ff41d9c17b9","Type":"ContainerStarted","Data":"95e6a8ecd21d0ea0efcd19e8e6d67bfb8956767dd93b391ac1ac73551cc7f0d1"} Dec 01 23:27:26 crc kubenswrapper[4962]: I1201 23:27:26.088104 4962 generic.go:334] "Generic (PLEG): container finished" podID="bd757916-f952-4d9c-8336-0ff41d9c17b9" containerID="813a9e396b8a81e5bd0b95087d22beea8f41b0b50a7de04b14344b6009234b93" exitCode=0 Dec 01 23:27:26 crc kubenswrapper[4962]: I1201 23:27:26.088432 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-s7cjj" event={"ID":"bd757916-f952-4d9c-8336-0ff41d9c17b9","Type":"ContainerDied","Data":"813a9e396b8a81e5bd0b95087d22beea8f41b0b50a7de04b14344b6009234b93"} Dec 01 23:27:27 crc kubenswrapper[4962]: I1201 23:27:27.216741 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:27 crc kubenswrapper[4962]: I1201 23:27:27.345535 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd757916-f952-4d9c-8336-0ff41d9c17b9-host\") pod \"bd757916-f952-4d9c-8336-0ff41d9c17b9\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " Dec 01 23:27:27 crc kubenswrapper[4962]: I1201 23:27:27.345641 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd757916-f952-4d9c-8336-0ff41d9c17b9-host" (OuterVolumeSpecName: "host") pod "bd757916-f952-4d9c-8336-0ff41d9c17b9" (UID: "bd757916-f952-4d9c-8336-0ff41d9c17b9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 23:27:27 crc kubenswrapper[4962]: I1201 23:27:27.346036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cj9x\" (UniqueName: \"kubernetes.io/projected/bd757916-f952-4d9c-8336-0ff41d9c17b9-kube-api-access-8cj9x\") pod \"bd757916-f952-4d9c-8336-0ff41d9c17b9\" (UID: \"bd757916-f952-4d9c-8336-0ff41d9c17b9\") " Dec 01 23:27:27 crc kubenswrapper[4962]: I1201 23:27:27.346853 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd757916-f952-4d9c-8336-0ff41d9c17b9-host\") on node \"crc\" DevicePath \"\"" Dec 01 23:27:27 crc kubenswrapper[4962]: I1201 23:27:27.351110 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd757916-f952-4d9c-8336-0ff41d9c17b9-kube-api-access-8cj9x" (OuterVolumeSpecName: "kube-api-access-8cj9x") pod "bd757916-f952-4d9c-8336-0ff41d9c17b9" (UID: "bd757916-f952-4d9c-8336-0ff41d9c17b9"). InnerVolumeSpecName "kube-api-access-8cj9x". 
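Every kubenswrapper record above carries the same klog-style header in front of the structured message: a severity letter (I/W/E), an MMDD date, a wall-clock time, the PID, and the emitting source file and line. A small sketch of how such a line could be split into fields; the regular expression is reconstructed from the visible shape of these lines, not taken from an official klog grammar:

```go
package main

import (
	"fmt"
	"regexp"
)

// Header shape visible in the records above:
//   <severity><MMDD> <HH:MM:SS.micros> <pid> <file>.go:<line>] <message>
var klogHeader = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w.-]+\.go:\d+)\] (.*)$`)

func main() {
	line := `I1201 23:27:23.231528 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mx56v/crc-debug-fq7ls"]`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog-style line")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s src=%s\nmsg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
```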
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:27:27 crc kubenswrapper[4962]: I1201 23:27:27.448713 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cj9x\" (UniqueName: \"kubernetes.io/projected/bd757916-f952-4d9c-8336-0ff41d9c17b9-kube-api-access-8cj9x\") on node \"crc\" DevicePath \"\"" Dec 01 23:27:28 crc kubenswrapper[4962]: I1201 23:27:28.113187 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-s7cjj" event={"ID":"bd757916-f952-4d9c-8336-0ff41d9c17b9","Type":"ContainerDied","Data":"95e6a8ecd21d0ea0efcd19e8e6d67bfb8956767dd93b391ac1ac73551cc7f0d1"} Dec 01 23:27:28 crc kubenswrapper[4962]: I1201 23:27:28.113464 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95e6a8ecd21d0ea0efcd19e8e6d67bfb8956767dd93b391ac1ac73551cc7f0d1" Dec 01 23:27:28 crc kubenswrapper[4962]: I1201 23:27:28.113300 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-s7cjj" Dec 01 23:27:28 crc kubenswrapper[4962]: I1201 23:27:28.220237 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:27:28 crc kubenswrapper[4962]: E1201 23:27:28.222128 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:27:28 crc kubenswrapper[4962]: I1201 23:27:28.370922 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mx56v/crc-debug-s7cjj"] Dec 01 23:27:28 crc kubenswrapper[4962]: I1201 23:27:28.380502 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mx56v/crc-debug-s7cjj"] Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.620328 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mx56v/crc-debug-thsp9"] Dec 01 23:27:29 crc kubenswrapper[4962]: E1201 23:27:29.621736 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd757916-f952-4d9c-8336-0ff41d9c17b9" containerName="container-00" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.621842 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd757916-f952-4d9c-8336-0ff41d9c17b9" containerName="container-00" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.622326 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd757916-f952-4d9c-8336-0ff41d9c17b9" containerName="container-00" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.623440 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.627430 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mx56v"/"default-dockercfg-vcz52" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.713828 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wfww\" (UniqueName: \"kubernetes.io/projected/e21d8006-0e58-4d47-8adf-480ad0c08a1b-kube-api-access-7wfww\") pod \"crc-debug-thsp9\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.713967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e21d8006-0e58-4d47-8adf-480ad0c08a1b-host\") pod \"crc-debug-thsp9\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.816712 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wfww\" (UniqueName: \"kubernetes.io/projected/e21d8006-0e58-4d47-8adf-480ad0c08a1b-kube-api-access-7wfww\") pod \"crc-debug-thsp9\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.816814 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e21d8006-0e58-4d47-8adf-480ad0c08a1b-host\") pod \"crc-debug-thsp9\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.817025 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e21d8006-0e58-4d47-8adf-480ad0c08a1b-host\") pod \"crc-debug-thsp9\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.845242 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wfww\" (UniqueName: \"kubernetes.io/projected/e21d8006-0e58-4d47-8adf-480ad0c08a1b-kube-api-access-7wfww\") pod \"crc-debug-thsp9\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:29 crc kubenswrapper[4962]: I1201 23:27:29.948134 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:30 crc kubenswrapper[4962]: I1201 23:27:30.155741 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-thsp9" event={"ID":"e21d8006-0e58-4d47-8adf-480ad0c08a1b","Type":"ContainerStarted","Data":"fb662e9c958168a966c36f375873bd86fdc413c5d225b31b698adf295100b038"} Dec 01 23:27:30 crc kubenswrapper[4962]: I1201 23:27:30.233538 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd757916-f952-4d9c-8336-0ff41d9c17b9" path="/var/lib/kubelet/pods/bd757916-f952-4d9c-8336-0ff41d9c17b9/volumes" Dec 01 23:27:31 crc kubenswrapper[4962]: I1201 23:27:31.164784 4962 generic.go:334] "Generic (PLEG): container finished" podID="e21d8006-0e58-4d47-8adf-480ad0c08a1b" containerID="c505d905b2f12f23489e356e024ff297b0a7663022b9bba2fa44aeb2c65d50bf" exitCode=0 Dec 01 23:27:31 crc kubenswrapper[4962]: I1201 23:27:31.164945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/crc-debug-thsp9" event={"ID":"e21d8006-0e58-4d47-8adf-480ad0c08a1b","Type":"ContainerDied","Data":"c505d905b2f12f23489e356e024ff297b0a7663022b9bba2fa44aeb2c65d50bf"} Dec 01 23:27:31 crc kubenswrapper[4962]: I1201 23:27:31.218784 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mx56v/crc-debug-thsp9"] Dec 01 23:27:31 crc kubenswrapper[4962]: I1201 23:27:31.229542 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mx56v/crc-debug-thsp9"] Dec 01 23:27:32 crc kubenswrapper[4962]: I1201 23:27:32.320560 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:32 crc kubenswrapper[4962]: I1201 23:27:32.390173 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e21d8006-0e58-4d47-8adf-480ad0c08a1b-host\") pod \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " Dec 01 23:27:32 crc kubenswrapper[4962]: I1201 23:27:32.390270 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wfww\" (UniqueName: \"kubernetes.io/projected/e21d8006-0e58-4d47-8adf-480ad0c08a1b-kube-api-access-7wfww\") pod \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\" (UID: \"e21d8006-0e58-4d47-8adf-480ad0c08a1b\") " Dec 01 23:27:32 crc kubenswrapper[4962]: I1201 23:27:32.390285 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e21d8006-0e58-4d47-8adf-480ad0c08a1b-host" (OuterVolumeSpecName: "host") pod "e21d8006-0e58-4d47-8adf-480ad0c08a1b" (UID: "e21d8006-0e58-4d47-8adf-480ad0c08a1b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 23:27:32 crc kubenswrapper[4962]: I1201 23:27:32.390894 4962 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e21d8006-0e58-4d47-8adf-480ad0c08a1b-host\") on node \"crc\" DevicePath \"\"" Dec 01 23:27:32 crc kubenswrapper[4962]: I1201 23:27:32.399928 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21d8006-0e58-4d47-8adf-480ad0c08a1b-kube-api-access-7wfww" (OuterVolumeSpecName: "kube-api-access-7wfww") pod "e21d8006-0e58-4d47-8adf-480ad0c08a1b" (UID: "e21d8006-0e58-4d47-8adf-480ad0c08a1b"). InnerVolumeSpecName "kube-api-access-7wfww". 
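All three crc-debug pods (fq7ls, s7cjj, thsp9) trace the same PLEG sequence: ContainerStarted, "container finished" with exitCode=0, ContainerDied, then DELETE/REMOVE and volume teardown. Pairing the Started/Died events by container ID yields a rough per-container runtime; a sketch under that pairing assumption, with second-granularity timestamps copied from the crc-debug-fq7ls entries above:

```go
package main

import (
	"fmt"
	"time"
)

// plegEvent mirrors the fields visible in the "SyncLoop (PLEG)" records.
// Journal time stands in for the actual event time, which is approximate.
type plegEvent struct {
	pod, containerID, kind string
	at                     time.Time
}

func main() {
	ts := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	events := []plegEvent{
		{"crc-debug-fq7ls", "f5c6acc25bde", "ContainerStarted", ts("2025-12-01 23:26:40")},
		{"crc-debug-fq7ls", "f5c6acc25bde", "ContainerDied", ts("2025-12-01 23:27:22")},
	}
	started := map[string]time.Time{}
	for _, e := range events {
		switch e.kind {
		case "ContainerStarted":
			started[e.containerID] = e.at
		case "ContainerDied":
			if s, ok := started[e.containerID]; ok {
				fmt.Printf("%s %s ran %v\n", e.pod, e.containerID, e.at.Sub(s)) // 42s
			}
		}
	}
}
```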
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:27:32 crc kubenswrapper[4962]: I1201 23:27:32.493558 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wfww\" (UniqueName: \"kubernetes.io/projected/e21d8006-0e58-4d47-8adf-480ad0c08a1b-kube-api-access-7wfww\") on node \"crc\" DevicePath \"\"" Dec 01 23:27:33 crc kubenswrapper[4962]: I1201 23:27:33.185298 4962 scope.go:117] "RemoveContainer" containerID="c505d905b2f12f23489e356e024ff297b0a7663022b9bba2fa44aeb2c65d50bf" Dec 01 23:27:33 crc kubenswrapper[4962]: I1201 23:27:33.185333 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/crc-debug-thsp9" Dec 01 23:27:33 crc kubenswrapper[4962]: E1201 23:27:33.498659 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode21d8006_0e58_4d47_8adf_480ad0c08a1b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode21d8006_0e58_4d47_8adf_480ad0c08a1b.slice/crio-fb662e9c958168a966c36f375873bd86fdc413c5d225b31b698adf295100b038\": RecentStats: unable to find data in memory cache]" Dec 01 23:27:34 crc kubenswrapper[4962]: I1201 23:27:34.239769 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21d8006-0e58-4d47-8adf-480ad0c08a1b" path="/var/lib/kubelet/pods/e21d8006-0e58-4d47-8adf-480ad0c08a1b/volumes" Dec 01 23:27:43 crc kubenswrapper[4962]: I1201 23:27:43.220544 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:27:43 crc kubenswrapper[4962]: E1201 23:27:43.221532 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:27:57 crc kubenswrapper[4962]: I1201 23:27:57.222614 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:27:57 crc kubenswrapper[4962]: E1201 23:27:57.224093 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.302146 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-api/0.log" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.485793 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-evaluator/0.log" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.490372 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-notifier/0.log" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.511655 4962 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f1377a9a-eb6d-42f1-89f5-f8383c69b93e/aodh-listener/0.log" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.680248 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c9498c86-5xmqd_91d67b1b-578f-46c7-aec8-83785d2fe411/barbican-api-log/0.log" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.699400 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c9498c86-5xmqd_91d67b1b-578f-46c7-aec8-83785d2fe411/barbican-api/0.log" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.814871 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7fbb7f6fcd-9l2gb_909a52c1-349c-4b1b-929f-7d2c554cad32/barbican-keystone-listener/0.log" Dec 01 23:28:05 crc kubenswrapper[4962]: I1201 23:28:05.985471 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7fbb7f6fcd-9l2gb_909a52c1-349c-4b1b-929f-7d2c554cad32/barbican-keystone-listener-log/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.080876 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76b577fdff-d85rv_0fcefe11-14bc-40f6-8552-da42f7b63977/barbican-worker/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.109644 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76b577fdff-d85rv_0fcefe11-14bc-40f6-8552-da42f7b63977/barbican-worker-log/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.293001 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4d6td_6dc28247-9b3f-421b-a195-2f89ea5b50f8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.412295 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/ceilometer-central-agent/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.547740 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/proxy-httpd/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.596022 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/ceilometer-notification-agent/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.607365 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6dc4f027-5299-427c-9726-65012507b49b/sg-core/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.832015 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_356fbcf6-1bde-4b7b-bf5f-7be551d1a03c/cinder-api-log/0.log" Dec 01 23:28:06 crc kubenswrapper[4962]: I1201 23:28:06.872303 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_356fbcf6-1bde-4b7b-bf5f-7be551d1a03c/cinder-api/0.log" Dec 01 23:28:07 crc kubenswrapper[4962]: I1201 23:28:07.003269 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d/cinder-scheduler/0.log" Dec 01 23:28:07 crc kubenswrapper[4962]: I1201 23:28:07.121920 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3c2a5fd2-18a9-46c7-8450-9c9cdeadea4d/probe/0.log" Dec 01 23:28:07 crc 
kubenswrapper[4962]: I1201 23:28:07.191492 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wl946_fae4b755-5a52-461b-939c-b870ddcc521b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:07 crc kubenswrapper[4962]: I1201 23:28:07.424600 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cjt2f_e004e0a7-fc4c-4236-a816-288147301262/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:07 crc kubenswrapper[4962]: I1201 23:28:07.646661 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-grqhx_c626973f-0e99-4e4b-bc2b-8caddbada7aa/init/0.log" Dec 01 23:28:07 crc kubenswrapper[4962]: I1201 23:28:07.898914 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-d2c4z_de8a047d-3a82-4ffe-a734-76c25d8997e5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:07 crc kubenswrapper[4962]: I1201 23:28:07.932811 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-grqhx_c626973f-0e99-4e4b-bc2b-8caddbada7aa/init/0.log" Dec 01 23:28:07 crc kubenswrapper[4962]: I1201 23:28:07.938679 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-grqhx_c626973f-0e99-4e4b-bc2b-8caddbada7aa/dnsmasq-dns/0.log" Dec 01 23:28:08 crc kubenswrapper[4962]: I1201 23:28:08.142896 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_87780c0b-00e7-44cd-93da-c22f2b2a771c/glance-httpd/0.log" Dec 01 23:28:08 crc kubenswrapper[4962]: I1201 23:28:08.155272 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_87780c0b-00e7-44cd-93da-c22f2b2a771c/glance-log/0.log" Dec 01 23:28:08 crc kubenswrapper[4962]: I1201 23:28:08.220253 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:28:08 crc kubenswrapper[4962]: E1201 23:28:08.220605 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:28:08 crc kubenswrapper[4962]: I1201 23:28:08.325514 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_891b6978-5cc9-464e-ae37-f9f7b3dadc62/glance-httpd/0.log" Dec 01 23:28:08 crc kubenswrapper[4962]: I1201 23:28:08.352335 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_891b6978-5cc9-464e-ae37-f9f7b3dadc62/glance-log/0.log" Dec 01 23:28:08 crc kubenswrapper[4962]: I1201 23:28:08.948916 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-544xz_3dd32d50-cae8-4762-ba07-ce8d8d1996b8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:08 crc kubenswrapper[4962]: I1201 23:28:08.981795 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-engine-65cfb46b8d-72cj6_59a79ba3-6726-4020-8e97-80654b9cc661/heat-engine/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.282407 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-68jbf_d7f57b07-0c88-4569-9062-bbaaf50abefe/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.323368 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-77477565cc-t4xcz_673eaa4d-d246-4ca5-8f8e-7b464149d355/heat-api/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.525538 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6d7cf4b459-tkf5n_bb34fc59-5f49-4cdf-81f2-9d2b03afc6e4/heat-cfnapi/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.533887 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29410441-sgtnd_14dca7aa-3ee9-4af1-85ba-e92ac88fd223/keystone-cron/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.731965 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29410501-d9brm_3395cab0-9781-4fec-8e37-a3c4be3aca9a/keystone-cron/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.804217 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_412ecf69-be53-4cb2-9ea4-867884bbf8cf/kube-state-metrics/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.850572 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bfc984cd5-wc42c_e30f9e2d-e0be-4484-bf6b-83c39beaa7e6/keystone-api/0.log" Dec 01 23:28:09 crc kubenswrapper[4962]: I1201 23:28:09.943361 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nh8kx_db6f9af8-342d-4a5d-bd75-21d8d0f95c04/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:10 crc kubenswrapper[4962]: I1201 23:28:10.032971 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-xrsvk_e3c41ae3-36b2-43dd-9580-fac72dc88d09/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:10 crc kubenswrapper[4962]: I1201 23:28:10.255787 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_7a4079d4-140a-438c-a252-c0669217e113/mysqld-exporter/0.log" Dec 01 23:28:10 crc kubenswrapper[4962]: I1201 23:28:10.933718 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b9776df9c-m5wv9_6ab1fe3f-42f6-4652-8d33-5f97b860b8fc/neutron-httpd/0.log" Dec 01 23:28:10 crc kubenswrapper[4962]: I1201 23:28:10.941814 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bcxmz_dd8fa2aa-dbde-49e8-9e3f-bff76efaf29c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:11 crc kubenswrapper[4962]: I1201 23:28:11.050808 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b9776df9c-m5wv9_6ab1fe3f-42f6-4652-8d33-5f97b860b8fc/neutron-api/0.log" Dec 01 23:28:11 crc kubenswrapper[4962]: I1201 23:28:11.532616 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9b1a4101-b960-4d3a-bba2-8472f8b2a726/nova-cell0-conductor-conductor/0.log" Dec 01 23:28:11 crc kubenswrapper[4962]: I1201 23:28:11.867326 4962 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-api-0_aabf746f-d3b5-4858-a2ec-9a5cec96720a/nova-api-log/0.log" Dec 01 23:28:12 crc kubenswrapper[4962]: I1201 23:28:12.015365 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7027bfd7-ae97-419b-aebd-11e811b45486/nova-cell1-conductor-conductor/0.log" Dec 01 23:28:12 crc kubenswrapper[4962]: I1201 23:28:12.296794 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bf253872-abad-4b40-b941-2cbada4988ac/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 23:28:12 crc kubenswrapper[4962]: I1201 23:28:12.337161 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-pl278_db0505bf-0445-4d21-9bc6-a483fdf94816/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:12 crc kubenswrapper[4962]: I1201 23:28:12.625748 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46f24440-8e28-4c9e-908d-ca07fd2edcfc/nova-metadata-log/0.log" Dec 01 23:28:12 crc kubenswrapper[4962]: I1201 23:28:12.719269 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_aabf746f-d3b5-4858-a2ec-9a5cec96720a/nova-api-api/0.log" Dec 01 23:28:13 crc kubenswrapper[4962]: I1201 23:28:13.224091 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_aebd10ab-b3dd-4bc7-8ea0-f5883d794715/mysql-bootstrap/0.log" Dec 01 23:28:13 crc kubenswrapper[4962]: I1201 23:28:13.356202 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_02074ca6-7293-4d4a-8354-f299b4ae4b5a/nova-scheduler-scheduler/0.log" Dec 01 23:28:13 crc kubenswrapper[4962]: I1201 23:28:13.440160 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_aebd10ab-b3dd-4bc7-8ea0-f5883d794715/mysql-bootstrap/0.log" Dec 01 23:28:13 crc kubenswrapper[4962]: I1201 23:28:13.451816 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_aebd10ab-b3dd-4bc7-8ea0-f5883d794715/galera/0.log" Dec 01 23:28:13 crc kubenswrapper[4962]: I1201 23:28:13.644825 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c09bcbbf-f96b-4f90-8f2d-9d635454a05e/mysql-bootstrap/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.113314 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c09bcbbf-f96b-4f90-8f2d-9d635454a05e/galera/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.153452 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c09bcbbf-f96b-4f90-8f2d-9d635454a05e/mysql-bootstrap/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.176215 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9fd1f254-7f23-46a6-b2fd-986de362e028/openstackclient/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.400749 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vqjlm_1f47734e-2a33-432b-8030-c82a75ec77c3/openstack-network-exporter/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.618476 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovsdb-server-init/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.825961 4962 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovsdb-server/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.870676 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovs-vswitchd/0.log" Dec 01 23:28:14 crc kubenswrapper[4962]: I1201 23:28:14.890615 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdpb9_88fa575e-baee-41dd-8c7e-72baff22783e/ovsdb-server-init/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.104565 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xd7ph_f4f0c9ce-824a-4a5b-adc9-f7a09b3ab97d/ovn-controller/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.309673 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_08d09a46-a04a-4b53-aa6c-e24f284063f0/openstack-network-exporter/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.382307 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46f24440-8e28-4c9e-908d-ca07fd2edcfc/nova-metadata-metadata/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.395897 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hwvmq_4ea2d579-b57c-41a6-a255-ee852e50ec7a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.587045 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_08d09a46-a04a-4b53-aa6c-e24f284063f0/ovn-northd/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.741386 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f893a462-9c1f-4b76-84fc-ba5e84364399/ovsdbserver-nb/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.776093 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f893a462-9c1f-4b76-84fc-ba5e84364399/openstack-network-exporter/0.log" Dec 01 23:28:15 crc kubenswrapper[4962]: I1201 23:28:15.954866 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe00e319-7859-4bac-9316-156263865d80/openstack-network-exporter/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.160066 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fe00e319-7859-4bac-9316-156263865d80/ovsdbserver-sb/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.432274 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54647f544-d6jzt_6107fcdb-ceea-4953-a667-e3a973c68de3/placement-api/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.466322 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54647f544-d6jzt_6107fcdb-ceea-4953-a667-e3a973c68de3/placement-log/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.515392 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/init-config-reloader/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.703140 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/thanos-sidecar/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.737208 4962 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/prometheus/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.795895 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/config-reloader/0.log" Dec 01 23:28:16 crc kubenswrapper[4962]: I1201 23:28:16.796052 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7bb7f1e5-3ee6-4c9f-8d8a-3e66dd8f5620/init-config-reloader/0.log" Dec 01 23:28:17 crc kubenswrapper[4962]: I1201 23:28:17.005549 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42940ca4-6f73-42b9-97b9-8fcf3fa4f968/setup-container/0.log" Dec 01 23:28:17 crc kubenswrapper[4962]: I1201 23:28:17.272640 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42940ca4-6f73-42b9-97b9-8fcf3fa4f968/setup-container/0.log" Dec 01 23:28:17 crc kubenswrapper[4962]: I1201 23:28:17.483862 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_42940ca4-6f73-42b9-97b9-8fcf3fa4f968/rabbitmq/0.log" Dec 01 23:28:17 crc kubenswrapper[4962]: I1201 23:28:17.486489 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2284f352-fb8b-4432-b26f-106c1255dd90/setup-container/0.log" Dec 01 23:28:17 crc kubenswrapper[4962]: I1201 23:28:17.804701 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sp5b2_aee89e9b-f93a-4100-bc51-0a701a9d9549/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:17 crc kubenswrapper[4962]: I1201 23:28:17.810967 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2284f352-fb8b-4432-b26f-106c1255dd90/setup-container/0.log" Dec 01 23:28:17 crc kubenswrapper[4962]: I1201 23:28:17.818220 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2284f352-fb8b-4432-b26f-106c1255dd90/rabbitmq/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.061809 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-t8n5k_b8f0680f-6407-4e35-a927-3c0613e4f3e5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.066981 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qs68c_805f56ee-17d1-4e5a-8655-756050592352/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.259702 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mwgwj_1e8e98c7-cc77-45f5-be56-cb73df6427e4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.382569 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-stgzx_4356462c-43b0-40db-824b-f9abb87cb9dd/ssh-known-hosts-edpm-deployment/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.659026 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-65c954fcc-wpwn7_85abfbd6-374e-486e-93f1-8e8c4e8b5da0/proxy-server/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.748809 4962 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5db2p_34cbe04f-2bf2-4b5e-bf91-00787b7e4fee/swift-ring-rebalance/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.816303 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-65c954fcc-wpwn7_85abfbd6-374e-486e-93f1-8e8c4e8b5da0/proxy-httpd/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.909142 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-auditor/0.log" Dec 01 23:28:18 crc kubenswrapper[4962]: I1201 23:28:18.931871 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-reaper/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.084811 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-replicator/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.151629 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/account-server/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.188960 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-auditor/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.260032 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-replicator/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.376273 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-server/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.422578 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/container-updater/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.455437 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-auditor/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.616752 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-replicator/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.622442 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-expirer/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.718036 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-updater/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.796784 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/object-server/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.864035 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/rsync/0.log" Dec 01 23:28:19 crc kubenswrapper[4962]: I1201 23:28:19.880126 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f53f9dd4-f949-4ae6-a2d5-7a19a21d80c3/swift-recon-cron/0.log" Dec 01 23:28:20 crc kubenswrapper[4962]: I1201 23:28:20.090240 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4cmp6_a7fabf85-9e84-477d-9831-1f6ff8c52e3e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:20 crc kubenswrapper[4962]: I1201 23:28:20.189905 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-wbrjz_6a103f12-9cb1-4018-9db7-67553233f69d/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:20 crc kubenswrapper[4962]: I1201 23:28:20.432550 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_792d09ec-504b-41d0-a382-4503283ad0d5/test-operator-logs-container/0.log" Dec 01 23:28:20 crc kubenswrapper[4962]: I1201 23:28:20.679055 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tjg57_f2a3fbd2-3eb8-4784-8442-3299926b0172/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 23:28:21 crc kubenswrapper[4962]: I1201 23:28:21.307611 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_07461b2c-c45f-45cf-a540-4c24797e3f16/tempest-tests-tempest-tests-runner/0.log" Dec 01 23:28:23 crc kubenswrapper[4962]: I1201 23:28:23.220818 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:28:23 crc kubenswrapper[4962]: E1201 23:28:23.221064 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:28:29 crc kubenswrapper[4962]: I1201 23:28:29.529047 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c1c35af6-81b8-418f-a1e9-e19209bab14d/memcached/0.log" Dec 01 23:28:36 crc kubenswrapper[4962]: I1201 23:28:36.227023 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:28:36 crc kubenswrapper[4962]: E1201 23:28:36.227861 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.500305 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2mr5z"] Dec 01 23:28:46 crc kubenswrapper[4962]: E1201 23:28:46.502070 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21d8006-0e58-4d47-8adf-480ad0c08a1b" containerName="container-00" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.502144 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e21d8006-0e58-4d47-8adf-480ad0c08a1b" containerName="container-00" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.502442 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21d8006-0e58-4d47-8adf-480ad0c08a1b" containerName="container-00" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.505389 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.522390 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mr5z"] Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.677625 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-utilities\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.677811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl2sh\" (UniqueName: \"kubernetes.io/projected/6fd477b3-9078-40f9-961a-ec8cb2423490-kube-api-access-vl2sh\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.677851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-catalog-content\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.780074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl2sh\" (UniqueName: \"kubernetes.io/projected/6fd477b3-9078-40f9-961a-ec8cb2423490-kube-api-access-vl2sh\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.780150 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-catalog-content\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.780250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-utilities\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.780729 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-catalog-content\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.780781 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-utilities\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.812733 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl2sh\" (UniqueName: \"kubernetes.io/projected/6fd477b3-9078-40f9-961a-ec8cb2423490-kube-api-access-vl2sh\") pod \"redhat-operators-2mr5z\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:46 crc kubenswrapper[4962]: I1201 23:28:46.827482 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:47 crc kubenswrapper[4962]: I1201 23:28:47.291564 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mr5z"] Dec 01 23:28:48 crc kubenswrapper[4962]: I1201 23:28:48.023891 4962 generic.go:334] "Generic (PLEG): container finished" podID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerID="5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea" exitCode=0 Dec 01 23:28:48 crc kubenswrapper[4962]: I1201 23:28:48.023983 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mr5z" event={"ID":"6fd477b3-9078-40f9-961a-ec8cb2423490","Type":"ContainerDied","Data":"5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea"} Dec 01 23:28:48 crc kubenswrapper[4962]: I1201 23:28:48.024207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mr5z" event={"ID":"6fd477b3-9078-40f9-961a-ec8cb2423490","Type":"ContainerStarted","Data":"62abd2199f5f938eab4d2f9740553f0c55abfa82f9dfd2a12ae3102a45ee22ec"} Dec 01 23:28:48 crc kubenswrapper[4962]: I1201 23:28:48.026624 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 23:28:49 crc kubenswrapper[4962]: I1201 23:28:49.035064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mr5z" event={"ID":"6fd477b3-9078-40f9-961a-ec8cb2423490","Type":"ContainerStarted","Data":"b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e"} Dec 01 23:28:49 crc kubenswrapper[4962]: I1201 23:28:49.220294 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:28:49 crc kubenswrapper[4962]: E1201 23:28:49.220562 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:28:50 crc kubenswrapper[4962]: I1201 23:28:50.371360 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/util/0.log" Dec 01 23:28:50 crc kubenswrapper[4962]: I1201 23:28:50.621891 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/util/0.log" Dec 01 23:28:50 crc kubenswrapper[4962]: I1201 23:28:50.633411 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/pull/0.log" Dec 01 23:28:50 crc kubenswrapper[4962]: I1201 23:28:50.675423 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/pull/0.log" Dec 01 23:28:51 crc kubenswrapper[4962]: I1201 23:28:51.362975 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/util/0.log" Dec 01 23:28:51 crc kubenswrapper[4962]: I1201 23:28:51.415705 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/extract/0.log" Dec 01 23:28:51 crc kubenswrapper[4962]: I1201 23:28:51.522329 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_73f65eef14c85a3f197e77d6eb02f32b624918b5a1b27530e5300f306dlvx4g_dfd7b1bd-1314-48d9-92f7-1adcba0a1cd3/pull/0.log" Dec 01 23:28:51 crc kubenswrapper[4962]: I1201 23:28:51.624228 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-t725d_0e2461fa-57b4-406a-9801-522b2e3ee2f0/kube-rbac-proxy/0.log" Dec 01 23:28:51 crc kubenswrapper[4962]: I1201 23:28:51.826848 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-t725d_0e2461fa-57b4-406a-9801-522b2e3ee2f0/manager/0.log" Dec 01 23:28:51 crc kubenswrapper[4962]: I1201 23:28:51.852634 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-2nvkp_39217f35-ba4e-402b-84fe-876ca232ff60/kube-rbac-proxy/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.034175 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-2nvkp_39217f35-ba4e-402b-84fe-876ca232ff60/manager/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.037215 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-sqwvg_8e655cd6-3169-46a0-b299-37d13dae8d3a/kube-rbac-proxy/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.108039 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-sqwvg_8e655cd6-3169-46a0-b299-37d13dae8d3a/manager/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.264775 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-8xc97_d871da7f-4b47-4931-aa3b-1525f50b2bde/kube-rbac-proxy/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.278944 4962 generic.go:334] "Generic (PLEG): container finished" podID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerID="b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e" exitCode=0 Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 
Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.278974 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mr5z" event={"ID":"6fd477b3-9078-40f9-961a-ec8cb2423490","Type":"ContainerDied","Data":"b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e"} Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.390290 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-8xc97_d871da7f-4b47-4931-aa3b-1525f50b2bde/manager/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.512445 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-sjzl9_1a9bd198-45fa-40ba-b3a0-55c150c211d6/kube-rbac-proxy/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.576263 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-sjzl9_1a9bd198-45fa-40ba-b3a0-55c150c211d6/manager/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.655078 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mllgh_fb3ad1a2-8ee0-4d12-8499-d10819081f1b/kube-rbac-proxy/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.747345 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mllgh_fb3ad1a2-8ee0-4d12-8499-d10819081f1b/manager/0.log" Dec 01 23:28:52 crc kubenswrapper[4962]: I1201 23:28:52.792132 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-27r4m_2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8/kube-rbac-proxy/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.029147 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-w68ng_d0ae9966-90f0-4d97-a056-dd9e86c81949/kube-rbac-proxy/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.052407 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-27r4m_2e5d2ffc-89ea-4f45-a3a2-1745f3b107e8/manager/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.135884 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-w68ng_d0ae9966-90f0-4d97-a056-dd9e86c81949/manager/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.223354 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-fkvsq_ed78cdfd-dc4e-4528-9542-6fc778f54e5f/kube-rbac-proxy/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.291533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mr5z" event={"ID":"6fd477b3-9078-40f9-961a-ec8cb2423490","Type":"ContainerStarted","Data":"48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb"} Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.306961 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2mr5z" podStartSLOduration=2.6308953710000003 podStartE2EDuration="7.306945751s" podCreationTimestamp="2025-12-01 23:28:46 +0000 UTC" firstStartedPulling="2025-12-01 23:28:48.026411836 +0000 UTC m=+6912.127851031" 
lastFinishedPulling="2025-12-01 23:28:52.702462216 +0000 UTC m=+6916.803901411" observedRunningTime="2025-12-01 23:28:53.306036946 +0000 UTC m=+6917.407476141" watchObservedRunningTime="2025-12-01 23:28:53.306945751 +0000 UTC m=+6917.408384946" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.406970 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-fkvsq_ed78cdfd-dc4e-4528-9542-6fc778f54e5f/manager/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.533905 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zvp89_3673ec86-6e36-4f0b-ac14-87e5d89e283e/kube-rbac-proxy/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.620988 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-zvp89_3673ec86-6e36-4f0b-ac14-87e5d89e283e/manager/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.727333 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nptld_fb72edda-e449-44f6-a85d-b74c0f3f9ad2/kube-rbac-proxy/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.846348 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nptld_fb72edda-e449-44f6-a85d-b74c0f3f9ad2/manager/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.957141 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-7d98c_795b9a42-a6d4-487b-84ef-0f1b3617ebfc/kube-rbac-proxy/0.log" Dec 01 23:28:53 crc kubenswrapper[4962]: I1201 23:28:53.958166 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-7d98c_795b9a42-a6d4-487b-84ef-0f1b3617ebfc/manager/0.log" Dec 01 23:28:54 crc kubenswrapper[4962]: I1201 23:28:54.138787 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mqrwk_1fb020cd-66c6-401d-be7e-9a26b62eb8d8/kube-rbac-proxy/0.log" Dec 01 23:28:54 crc kubenswrapper[4962]: I1201 23:28:54.306557 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mqrwk_1fb020cd-66c6-401d-be7e-9a26b62eb8d8/manager/0.log" Dec 01 23:28:54 crc kubenswrapper[4962]: I1201 23:28:54.363122 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xvbpf_f2e499a5-b89a-45d4-bd3e-9f743e010a51/kube-rbac-proxy/0.log" Dec 01 23:28:54 crc kubenswrapper[4962]: I1201 23:28:54.411064 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-xvbpf_f2e499a5-b89a-45d4-bd3e-9f743e010a51/manager/0.log" Dec 01 23:28:54 crc kubenswrapper[4962]: I1201 23:28:54.577515 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc_fec45066-0c5d-48de-9c33-f166f33131f0/kube-rbac-proxy/0.log"
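
The pod_startup_latency_tracker entry above reports two durations for redhat-operators-2mr5z. The numbers are consistent with podStartSLOduration being the end-to-end startup time minus the image-pull window; the sketch below reproduces both values from the timestamps in the entry (the subtraction rule is inferred from these numbers, not taken from kubelet source):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the "Observed pod startup duration" entry above.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-01 23:28:46 +0000 UTC")             // podCreationTimestamp
	firstPull := parse("2025-12-01 23:28:48.026411836 +0000 UTC") // firstStartedPulling
	lastPull := parse("2025-12-01 23:28:52.702462216 +0000 UTC")  // lastFinishedPulling
	running := parse("2025-12-01 23:28:53.306945751 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: pull time excluded
	fmt.Println(e2e, slo)                // 7.306945751s 2.630895371s
}
```
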
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z6mbc_fec45066-0c5d-48de-9c33-f166f33131f0/manager/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.016460 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5dc9c6958f-52l87_555a34ee-8a52-4159-8e01-ed6dcceb45e9/operator/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.130068 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sd67v_d151cbe8-8f07-425d-bd99-c06451f4a3cf/registry-server/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.381041 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hfkpq_d62cdff4-c4d1-44fb-99dc-bdd6a31d03af/kube-rbac-proxy/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.463757 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hfkpq_d62cdff4-c4d1-44fb-99dc-bdd6a31d03af/manager/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.479495 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4hgng_ec3039da-9f5e-4870-8579-8560a63221a8/kube-rbac-proxy/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.716408 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4hgng_ec3039da-9f5e-4870-8579-8560a63221a8/manager/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.794285 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lv89z_400ba839-34f0-4463-a318-c1bcba6e5039/operator/0.log" Dec 01 23:28:55 crc kubenswrapper[4962]: I1201 23:28:55.925875 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-6tfrn_03f5786b-da6f-4b56-ac07-fb563f0a85b4/kube-rbac-proxy/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.016926 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-6tfrn_03f5786b-da6f-4b56-ac07-fb563f0a85b4/manager/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.135163 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6c484b4dc4-ch82f_af182ba4-78a6-41eb-bf65-8abd64207122/kube-rbac-proxy/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.294020 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zzc5v_0aff0b93-1032-412b-9628-3ab9e94717a8/kube-rbac-proxy/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.325547 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d8646fccf-4h8tf_05992e60-e6fc-43a0-b44a-d177ae3f4c83/manager/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.416654 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-zzc5v_0aff0b93-1032-412b-9628-3ab9e94717a8/manager/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.494870 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6c484b4dc4-ch82f_af182ba4-78a6-41eb-bf65-8abd64207122/manager/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.573080 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-q7bxg_c847e733-65b6-4724-8037-5199d847f1ba/kube-rbac-proxy/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.596277 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-q7bxg_c847e733-65b6-4724-8037-5199d847f1ba/manager/0.log" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.827560 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:56 crc kubenswrapper[4962]: I1201 23:28:56.827861 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:28:57 crc kubenswrapper[4962]: I1201 23:28:57.873571 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2mr5z" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="registry-server" probeResult="failure" output=< Dec 01 23:28:57 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Dec 01 23:28:57 crc kubenswrapper[4962]: > Dec 01 23:29:02 crc kubenswrapper[4962]: I1201 23:29:02.220678 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:29:02 crc kubenswrapper[4962]: E1201 23:29:02.221680 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:29:06 crc kubenswrapper[4962]: I1201 23:29:06.926312 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:29:06 crc kubenswrapper[4962]: I1201 23:29:06.997447 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:29:07 crc kubenswrapper[4962]: I1201 23:29:07.166021 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mr5z"] Dec 01 23:29:08 crc kubenswrapper[4962]: I1201 23:29:08.452510 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2mr5z" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="registry-server" containerID="cri-o://48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb" gracePeriod=2 Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.078705 4962 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.078705 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.216170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl2sh\" (UniqueName: \"kubernetes.io/projected/6fd477b3-9078-40f9-961a-ec8cb2423490-kube-api-access-vl2sh\") pod \"6fd477b3-9078-40f9-961a-ec8cb2423490\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.216328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-catalog-content\") pod \"6fd477b3-9078-40f9-961a-ec8cb2423490\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.216455 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-utilities\") pod \"6fd477b3-9078-40f9-961a-ec8cb2423490\" (UID: \"6fd477b3-9078-40f9-961a-ec8cb2423490\") " Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.217096 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-utilities" (OuterVolumeSpecName: "utilities") pod "6fd477b3-9078-40f9-961a-ec8cb2423490" (UID: "6fd477b3-9078-40f9-961a-ec8cb2423490"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.217782 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.229171 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd477b3-9078-40f9-961a-ec8cb2423490-kube-api-access-vl2sh" (OuterVolumeSpecName: "kube-api-access-vl2sh") pod "6fd477b3-9078-40f9-961a-ec8cb2423490" (UID: "6fd477b3-9078-40f9-961a-ec8cb2423490"). InnerVolumeSpecName "kube-api-access-vl2sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.322920 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl2sh\" (UniqueName: \"kubernetes.io/projected/6fd477b3-9078-40f9-961a-ec8cb2423490-kube-api-access-vl2sh\") on node \"crc\" DevicePath \"\"" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.345561 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fd477b3-9078-40f9-961a-ec8cb2423490" (UID: "6fd477b3-9078-40f9-961a-ec8cb2423490"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.426198 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd477b3-9078-40f9-961a-ec8cb2423490-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.467890 4962 generic.go:334] "Generic (PLEG): container finished" podID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerID="48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb" exitCode=0 Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.467946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mr5z" event={"ID":"6fd477b3-9078-40f9-961a-ec8cb2423490","Type":"ContainerDied","Data":"48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb"} Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.467974 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mr5z" event={"ID":"6fd477b3-9078-40f9-961a-ec8cb2423490","Type":"ContainerDied","Data":"62abd2199f5f938eab4d2f9740553f0c55abfa82f9dfd2a12ae3102a45ee22ec"} Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.467991 4962 scope.go:117] "RemoveContainer" containerID="48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.469795 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mr5z" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.517643 4962 scope.go:117] "RemoveContainer" containerID="b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.538368 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mr5z"] Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.546212 4962 scope.go:117] "RemoveContainer" containerID="5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.560838 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2mr5z"] Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.627231 4962 scope.go:117] "RemoveContainer" containerID="48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb" Dec 01 23:29:09 crc kubenswrapper[4962]: E1201 23:29:09.628397 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb\": container with ID starting with 48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb not found: ID does not exist" containerID="48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.628435 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb"} err="failed to get container status \"48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb\": rpc error: code = NotFound desc = could not find container \"48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb\": container with ID starting with 48dae431a93705c2a2f6df48e4ae73b09649bcf676f2c9546c5a7dfa1df2f7fb not found: ID does not exist" Dec 01 23:29:09 crc 
kubenswrapper[4962]: I1201 23:29:09.628463 4962 scope.go:117] "RemoveContainer" containerID="b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e" Dec 01 23:29:09 crc kubenswrapper[4962]: E1201 23:29:09.628985 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e\": container with ID starting with b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e not found: ID does not exist" containerID="b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.629017 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e"} err="failed to get container status \"b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e\": rpc error: code = NotFound desc = could not find container \"b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e\": container with ID starting with b41c47b43a35e5850c63bec58cd4daf1e12241c1366d86be11615a11de779f2e not found: ID does not exist" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.629038 4962 scope.go:117] "RemoveContainer" containerID="5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea" Dec 01 23:29:09 crc kubenswrapper[4962]: E1201 23:29:09.629334 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea\": container with ID starting with 5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea not found: ID does not exist" containerID="5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea" Dec 01 23:29:09 crc kubenswrapper[4962]: I1201 23:29:09.629380 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea"} err="failed to get container status \"5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea\": rpc error: code = NotFound desc = could not find container \"5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea\": container with ID starting with 5040ca60918c18575b2896047fb76cbe8b63bf8a0a0489abdfc75945b8759aea not found: ID does not exist" Dec 01 23:29:10 crc kubenswrapper[4962]: I1201 23:29:10.240899 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" path="/var/lib/kubelet/pods/6fd477b3-9078-40f9-961a-ec8cb2423490/volumes" Dec 01 23:29:17 crc kubenswrapper[4962]: I1201 23:29:17.084285 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nl5jh_00f6ed0c-f791-460d-acd4-d100a0b21710/control-plane-machine-set-operator/0.log" Dec 01 23:29:17 crc kubenswrapper[4962]: I1201 23:29:17.221327 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:29:17 crc kubenswrapper[4962]: E1201 23:29:17.221958 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"
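
The machine-config-daemon entries that keep recurring here ("back-off 5m0s restarting failed container=...") show a pod stuck in CrashLoopBackOff: each sync attempt is skipped until the restart back-off expires, which finally happens at 23:30:36 further down, when the container is started again. The delay behind that message doubles after every crash up to a cap; a minimal sketch of that behavior, assuming kubelet's default 10s base and 5m cap (the loop is an illustration, not kubelet code):

```go
package main

import (
	"fmt"
	"time"
)

// Capped exponential restart back-off: the delay doubles after every
// crash until it reaches the cap, at which point the kubelet keeps
// reporting "back-off 5m0s restarting failed container=...".
func main() {
	const (
		base     = 10 * time.Second // assumed default initial delay
		maxDelay = 5 * time.Minute  // cap, matching "back-off 5m0s" above
	)
	delay := base
	for crash := 1; crash <= 8; crash++ {
		fmt.Printf("crash %d: next restart attempt in %v\n", crash, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// crash 1: 10s, 2: 20s, 3: 40s, 4: 1m20s, 5: 2m40s, 6-8: 5m0s
}
```
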
pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:29:17 crc kubenswrapper[4962]: I1201 23:29:17.289243 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qj8zv_12136e64-010e-49bc-9c3e-d1c65467f361/kube-rbac-proxy/0.log" Dec 01 23:29:17 crc kubenswrapper[4962]: I1201 23:29:17.291460 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qj8zv_12136e64-010e-49bc-9c3e-d1c65467f361/machine-api-operator/0.log" Dec 01 23:29:29 crc kubenswrapper[4962]: I1201 23:29:29.220049 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:29:29 crc kubenswrapper[4962]: E1201 23:29:29.220810 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:29:31 crc kubenswrapper[4962]: I1201 23:29:31.380897 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lzx5s_09bfd310-14dd-4f11-90d4-2b67683a468a/cert-manager-controller/0.log" Dec 01 23:29:31 crc kubenswrapper[4962]: I1201 23:29:31.553846 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mfzw8_1ed989fa-af32-4ec2-9ead-2681d1b96741/cert-manager-cainjector/0.log" Dec 01 23:29:31 crc kubenswrapper[4962]: I1201 23:29:31.574860 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-zwlk2_bce7cb19-aeee-4ad9-9284-46e78c5e1d6f/cert-manager-webhook/0.log" Dec 01 23:29:40 crc kubenswrapper[4962]: I1201 23:29:40.219840 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:29:40 crc kubenswrapper[4962]: E1201 23:29:40.220664 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:29:45 crc kubenswrapper[4962]: I1201 23:29:45.132986 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-48pxc_5a7b0f93-3ea3-4a0f-baef-4ca08977cbde/nmstate-console-plugin/0.log" Dec 01 23:29:45 crc kubenswrapper[4962]: I1201 23:29:45.347451 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s4vbq_770c0f72-8589-4617-8b07-92d0702ff5b8/nmstate-handler/0.log" Dec 01 23:29:45 crc kubenswrapper[4962]: I1201 23:29:45.350510 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-l62dd_838c46a9-9378-4801-8cc4-e203bf8c2972/kube-rbac-proxy/0.log" Dec 01 23:29:45 crc kubenswrapper[4962]: I1201 23:29:45.477692 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-l62dd_838c46a9-9378-4801-8cc4-e203bf8c2972/nmstate-metrics/0.log" Dec 01 23:29:45 crc kubenswrapper[4962]: I1201 23:29:45.692868 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-r6ndt_c7fabe32-40b1-4300-bd18-c51c12e45a21/nmstate-operator/0.log" Dec 01 23:29:45 crc kubenswrapper[4962]: I1201 23:29:45.780111 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-s5czc_2473e9c3-5f3d-4122-ae3c-c0ef0de79201/nmstate-webhook/0.log" Dec 01 23:29:55 crc kubenswrapper[4962]: I1201 23:29:55.221436 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:29:55 crc kubenswrapper[4962]: E1201 23:29:55.222846 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:29:59 crc kubenswrapper[4962]: I1201 23:29:59.390438 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/kube-rbac-proxy/0.log" Dec 01 23:29:59 crc kubenswrapper[4962]: I1201 23:29:59.511600 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/manager/0.log" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.178713 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn"] Dec 01 23:30:00 crc kubenswrapper[4962]: E1201 23:30:00.179528 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="extract-content" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.179634 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="extract-content" Dec 01 23:30:00 crc kubenswrapper[4962]: E1201 23:30:00.179731 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="extract-utilities" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.179812 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="extract-utilities" Dec 01 23:30:00 crc kubenswrapper[4962]: E1201 23:30:00.179970 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="registry-server" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.180076 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="registry-server" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.180594 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd477b3-9078-40f9-961a-ec8cb2423490" containerName="registry-server" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.182358 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.184593 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.189515 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.213791 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn"] Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.343193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4kg\" (UniqueName: \"kubernetes.io/projected/e333bd86-25f3-427d-b668-a12b1442f6e9-kube-api-access-4m4kg\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.343495 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e333bd86-25f3-427d-b668-a12b1442f6e9-secret-volume\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.343560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e333bd86-25f3-427d-b668-a12b1442f6e9-config-volume\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.446655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4kg\" (UniqueName: \"kubernetes.io/projected/e333bd86-25f3-427d-b668-a12b1442f6e9-kube-api-access-4m4kg\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.447765 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e333bd86-25f3-427d-b668-a12b1442f6e9-secret-volume\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.448016 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e333bd86-25f3-427d-b668-a12b1442f6e9-config-volume\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.449071 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e333bd86-25f3-427d-b668-a12b1442f6e9-config-volume\") pod 
\"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.454631 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e333bd86-25f3-427d-b668-a12b1442f6e9-secret-volume\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.472586 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4kg\" (UniqueName: \"kubernetes.io/projected/e333bd86-25f3-427d-b668-a12b1442f6e9-kube-api-access-4m4kg\") pod \"collect-profiles-29410530-6w5hn\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:00 crc kubenswrapper[4962]: I1201 23:30:00.510658 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:01 crc kubenswrapper[4962]: I1201 23:30:01.078277 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn"] Dec 01 23:30:01 crc kubenswrapper[4962]: I1201 23:30:01.106164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" event={"ID":"e333bd86-25f3-427d-b668-a12b1442f6e9","Type":"ContainerStarted","Data":"d0144f8ce96c359d89217b03abda522c6496d71b4789a2db183de0f80fcdc96e"} Dec 01 23:30:02 crc kubenswrapper[4962]: I1201 23:30:02.119707 4962 generic.go:334] "Generic (PLEG): container finished" podID="e333bd86-25f3-427d-b668-a12b1442f6e9" containerID="c66d21ea729d5119feb85c01dd31ca254fd0ffd521b1640734cc778d6fbb2931" exitCode=0 Dec 01 23:30:02 crc kubenswrapper[4962]: I1201 23:30:02.120213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" event={"ID":"e333bd86-25f3-427d-b668-a12b1442f6e9","Type":"ContainerDied","Data":"c66d21ea729d5119feb85c01dd31ca254fd0ffd521b1640734cc778d6fbb2931"} Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.609208 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.736797 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e333bd86-25f3-427d-b668-a12b1442f6e9-config-volume\") pod \"e333bd86-25f3-427d-b668-a12b1442f6e9\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.737340 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m4kg\" (UniqueName: \"kubernetes.io/projected/e333bd86-25f3-427d-b668-a12b1442f6e9-kube-api-access-4m4kg\") pod \"e333bd86-25f3-427d-b668-a12b1442f6e9\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.737588 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e333bd86-25f3-427d-b668-a12b1442f6e9-secret-volume\") pod \"e333bd86-25f3-427d-b668-a12b1442f6e9\" (UID: \"e333bd86-25f3-427d-b668-a12b1442f6e9\") " Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.737559 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e333bd86-25f3-427d-b668-a12b1442f6e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "e333bd86-25f3-427d-b668-a12b1442f6e9" (UID: "e333bd86-25f3-427d-b668-a12b1442f6e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.738768 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e333bd86-25f3-427d-b668-a12b1442f6e9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.745016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e333bd86-25f3-427d-b668-a12b1442f6e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e333bd86-25f3-427d-b668-a12b1442f6e9" (UID: "e333bd86-25f3-427d-b668-a12b1442f6e9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.745101 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e333bd86-25f3-427d-b668-a12b1442f6e9-kube-api-access-4m4kg" (OuterVolumeSpecName: "kube-api-access-4m4kg") pod "e333bd86-25f3-427d-b668-a12b1442f6e9" (UID: "e333bd86-25f3-427d-b668-a12b1442f6e9"). InnerVolumeSpecName "kube-api-access-4m4kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.841238 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e333bd86-25f3-427d-b668-a12b1442f6e9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 23:30:03 crc kubenswrapper[4962]: I1201 23:30:03.841281 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m4kg\" (UniqueName: \"kubernetes.io/projected/e333bd86-25f3-427d-b668-a12b1442f6e9-kube-api-access-4m4kg\") on node \"crc\" DevicePath \"\"" Dec 01 23:30:04 crc kubenswrapper[4962]: I1201 23:30:04.143459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" event={"ID":"e333bd86-25f3-427d-b668-a12b1442f6e9","Type":"ContainerDied","Data":"d0144f8ce96c359d89217b03abda522c6496d71b4789a2db183de0f80fcdc96e"} Dec 01 23:30:04 crc kubenswrapper[4962]: I1201 23:30:04.143497 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0144f8ce96c359d89217b03abda522c6496d71b4789a2db183de0f80fcdc96e" Dec 01 23:30:04 crc kubenswrapper[4962]: I1201 23:30:04.143583 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410530-6w5hn" Dec 01 23:30:04 crc kubenswrapper[4962]: I1201 23:30:04.719281 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp"] Dec 01 23:30:04 crc kubenswrapper[4962]: I1201 23:30:04.730032 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410485-bj2bp"] Dec 01 23:30:06 crc kubenswrapper[4962]: I1201 23:30:06.239133 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258b4cf8-4035-421a-acca-98b2a30e66c4" path="/var/lib/kubelet/pods/258b4cf8-4035-421a-acca-98b2a30e66c4/volumes" Dec 01 23:30:08 crc kubenswrapper[4962]: I1201 23:30:08.220066 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:30:08 crc kubenswrapper[4962]: E1201 23:30:08.221017 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:30:10 crc kubenswrapper[4962]: I1201 23:30:10.290089 4962 scope.go:117] "RemoveContainer" containerID="0248c1caa1b487c689436888931f18d07f42432086c6f78811ef97f31a5f988c" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.420385 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-6l74q_6965bdb4-04f5-486b-9897-b190e56d69b0/cluster-logging-operator/0.log" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.571680 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-ppj4c_15e991cf-b72c-462a-bc84-b157fee8ac90/collector/0.log" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.636267 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-compactor-0_b10ba804-253d-4972-bfd5-9f5fb9847989/loki-compactor/0.log" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.726373 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-prtfr_deb58cb2-860d-49d2-95e1-12aa147bd419/loki-distributor/0.log" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.815766 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-8szr5_a89c265b-cf90-4c13-9e7e-ebd27f1b3463/gateway/0.log" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.877334 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-8szr5_a89c265b-cf90-4c13-9e7e-ebd27f1b3463/opa/0.log" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.964727 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-pnn6j_8fdb44c5-cad3-460a-a6c8-90e65be7c1ce/gateway/0.log" Dec 01 23:30:14 crc kubenswrapper[4962]: I1201 23:30:14.967493 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-749d76f66f-pnn6j_8fdb44c5-cad3-460a-a6c8-90e65be7c1ce/opa/0.log" Dec 01 23:30:15 crc kubenswrapper[4962]: I1201 23:30:15.095822 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_92297031-5f57-47f1-a6de-4a94b6490937/loki-index-gateway/0.log" Dec 01 23:30:15 crc kubenswrapper[4962]: I1201 23:30:15.252994 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_30d8b489-fae1-4ed5-8a5c-19d7bad83a3d/loki-ingester/0.log" Dec 01 23:30:15 crc kubenswrapper[4962]: I1201 23:30:15.320451 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-khxd2_5e9077bf-815a-4c1f-8956-bc4094f59ceb/loki-querier/0.log" Dec 01 23:30:15 crc kubenswrapper[4962]: I1201 23:30:15.430342 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-7wvzd_dc99ea3a-cc9e-4e3d-9d0d-16aaa0ae5faa/loki-query-frontend/0.log" Dec 01 23:30:22 crc kubenswrapper[4962]: I1201 23:30:22.222186 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:30:22 crc kubenswrapper[4962]: E1201 23:30:22.223375 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.100023 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rzggv"] Dec 01 23:30:28 crc kubenswrapper[4962]: E1201 23:30:28.101051 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e333bd86-25f3-427d-b668-a12b1442f6e9" containerName="collect-profiles" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.101063 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e333bd86-25f3-427d-b668-a12b1442f6e9" containerName="collect-profiles" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 
23:30:28.101314 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e333bd86-25f3-427d-b668-a12b1442f6e9" containerName="collect-profiles" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.103332 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.130777 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzggv"] Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.153800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-utilities\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.154380 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-catalog-content\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.154422 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpd2w\" (UniqueName: \"kubernetes.io/projected/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-kube-api-access-fpd2w\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.257212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-utilities\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.257468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpd2w\" (UniqueName: \"kubernetes.io/projected/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-kube-api-access-fpd2w\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.257493 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-catalog-content\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.258191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-utilities\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.258430 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-catalog-content\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.281687 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpd2w\" (UniqueName: \"kubernetes.io/projected/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-kube-api-access-fpd2w\") pod \"community-operators-rzggv\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.423199 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:28 crc kubenswrapper[4962]: I1201 23:30:28.915135 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzggv"] Dec 01 23:30:29 crc kubenswrapper[4962]: I1201 23:30:29.441946 4962 generic.go:334] "Generic (PLEG): container finished" podID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerID="7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c" exitCode=0 Dec 01 23:30:29 crc kubenswrapper[4962]: I1201 23:30:29.441992 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzggv" event={"ID":"4c5b803d-7c90-4650-ba0e-895b8da4d9aa","Type":"ContainerDied","Data":"7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c"} Dec 01 23:30:29 crc kubenswrapper[4962]: I1201 23:30:29.442206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzggv" event={"ID":"4c5b803d-7c90-4650-ba0e-895b8da4d9aa","Type":"ContainerStarted","Data":"4da69310667267d5229ccaf6f65695f277fab022c829bd849365b9c5edeb097c"} Dec 01 23:30:30 crc kubenswrapper[4962]: I1201 23:30:30.324777 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hf6jx_519181c6-2c70-42ee-825f-427fe5942b07/kube-rbac-proxy/0.log" Dec 01 23:30:30 crc kubenswrapper[4962]: I1201 23:30:30.466329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzggv" event={"ID":"4c5b803d-7c90-4650-ba0e-895b8da4d9aa","Type":"ContainerStarted","Data":"fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd"} Dec 01 23:30:30 crc kubenswrapper[4962]: I1201 23:30:30.532842 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hf6jx_519181c6-2c70-42ee-825f-427fe5942b07/controller/0.log" Dec 01 23:30:30 crc kubenswrapper[4962]: I1201 23:30:30.591409 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:30:30 crc kubenswrapper[4962]: I1201 23:30:30.773353 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:30:30 crc kubenswrapper[4962]: I1201 23:30:30.801691 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:30:30 crc kubenswrapper[4962]: I1201 23:30:30.850814 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:30:30 crc kubenswrapper[4962]: 
I1201 23:30:30.864649 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.002610 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.008656 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.094091 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.151702 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.287799 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-reloader/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.303045 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-frr-files/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.305809 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/cp-metrics/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.406184 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/controller/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.479379 4962 generic.go:334] "Generic (PLEG): container finished" podID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerID="fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd" exitCode=0 Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.479428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzggv" event={"ID":"4c5b803d-7c90-4650-ba0e-895b8da4d9aa","Type":"ContainerDied","Data":"fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd"} Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.490187 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/frr-metrics/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.552543 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/kube-rbac-proxy/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.633513 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/kube-rbac-proxy-frr/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.766431 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/reloader/0.log" Dec 01 23:30:31 crc kubenswrapper[4962]: I1201 23:30:31.926093 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-2xwv6_ba7de090-9085-47a3-a086-73f78775d865/frr-k8s-webhook-server/0.log" Dec 01 
23:30:32 crc kubenswrapper[4962]: I1201 23:30:32.287596 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74475bd8d7-k5jkf_e46e036e-ca57-4675-a356-6a0cf72b184d/webhook-server/0.log" Dec 01 23:30:32 crc kubenswrapper[4962]: I1201 23:30:32.307263 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fc9ff4f78-6q794_06d500dd-2267-451a-992d-d676f1033bb6/manager/0.log" Dec 01 23:30:32 crc kubenswrapper[4962]: I1201 23:30:32.619514 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5gxh_9dc8d3dc-4cdb-45b7-a54f-83db94bdde05/kube-rbac-proxy/0.log" Dec 01 23:30:33 crc kubenswrapper[4962]: I1201 23:30:33.144380 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5gxh_9dc8d3dc-4cdb-45b7-a54f-83db94bdde05/speaker/0.log" Dec 01 23:30:33 crc kubenswrapper[4962]: I1201 23:30:33.504858 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzggv" event={"ID":"4c5b803d-7c90-4650-ba0e-895b8da4d9aa","Type":"ContainerStarted","Data":"0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44"} Dec 01 23:30:33 crc kubenswrapper[4962]: I1201 23:30:33.531814 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rzggv" podStartSLOduration=2.6637029 podStartE2EDuration="5.531798611s" podCreationTimestamp="2025-12-01 23:30:28 +0000 UTC" firstStartedPulling="2025-12-01 23:30:29.444527261 +0000 UTC m=+7013.545966446" lastFinishedPulling="2025-12-01 23:30:32.312622962 +0000 UTC m=+7016.414062157" observedRunningTime="2025-12-01 23:30:33.531671558 +0000 UTC m=+7017.633110753" watchObservedRunningTime="2025-12-01 23:30:33.531798611 +0000 UTC m=+7017.633237806" Dec 01 23:30:33 crc kubenswrapper[4962]: I1201 23:30:33.589134 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mgx99_7590e32f-b0cb-46dc-a679-46b2ede43ba0/frr/0.log" Dec 01 23:30:36 crc kubenswrapper[4962]: I1201 23:30:36.236853 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8" Dec 01 23:30:36 crc kubenswrapper[4962]: I1201 23:30:36.539026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"29bbb591d47c3a36d7be2b98516f84b98893107f8cb3db99681cfac14431b8bd"} Dec 01 23:30:38 crc kubenswrapper[4962]: I1201 23:30:38.423585 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:38 crc kubenswrapper[4962]: I1201 23:30:38.424258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:38 crc kubenswrapper[4962]: I1201 23:30:38.485259 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:38 crc kubenswrapper[4962]: I1201 23:30:38.621435 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:38 crc kubenswrapper[4962]: I1201 23:30:38.749475 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzggv"] Dec 01 23:30:40 crc 
kubenswrapper[4962]: I1201 23:30:40.589681 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rzggv" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="registry-server" containerID="cri-o://0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44" gracePeriod=2 Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.144990 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.278593 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-catalog-content\") pod \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.278988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpd2w\" (UniqueName: \"kubernetes.io/projected/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-kube-api-access-fpd2w\") pod \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.279134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-utilities\") pod \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\" (UID: \"4c5b803d-7c90-4650-ba0e-895b8da4d9aa\") " Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.280997 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-utilities" (OuterVolumeSpecName: "utilities") pod "4c5b803d-7c90-4650-ba0e-895b8da4d9aa" (UID: "4c5b803d-7c90-4650-ba0e-895b8da4d9aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.285319 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-kube-api-access-fpd2w" (OuterVolumeSpecName: "kube-api-access-fpd2w") pod "4c5b803d-7c90-4650-ba0e-895b8da4d9aa" (UID: "4c5b803d-7c90-4650-ba0e-895b8da4d9aa"). InnerVolumeSpecName "kube-api-access-fpd2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.332924 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c5b803d-7c90-4650-ba0e-895b8da4d9aa" (UID: "4c5b803d-7c90-4650-ba0e-895b8da4d9aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.381790 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.381821 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.381834 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpd2w\" (UniqueName: \"kubernetes.io/projected/4c5b803d-7c90-4650-ba0e-895b8da4d9aa-kube-api-access-fpd2w\") on node \"crc\" DevicePath \"\"" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.601090 4962 generic.go:334] "Generic (PLEG): container finished" podID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerID="0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44" exitCode=0 Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.601144 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzggv" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.601142 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzggv" event={"ID":"4c5b803d-7c90-4650-ba0e-895b8da4d9aa","Type":"ContainerDied","Data":"0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44"} Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.601625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzggv" event={"ID":"4c5b803d-7c90-4650-ba0e-895b8da4d9aa","Type":"ContainerDied","Data":"4da69310667267d5229ccaf6f65695f277fab022c829bd849365b9c5edeb097c"} Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.601666 4962 scope.go:117] "RemoveContainer" containerID="0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.636717 4962 scope.go:117] "RemoveContainer" containerID="fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.649097 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzggv"] Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.665489 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rzggv"] Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.676619 4962 scope.go:117] "RemoveContainer" containerID="7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.747094 4962 scope.go:117] "RemoveContainer" containerID="0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44" Dec 01 23:30:41 crc kubenswrapper[4962]: E1201 23:30:41.747645 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44\": container with ID starting with 0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44 not found: ID does not exist" containerID="0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.747682 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44"} err="failed to get container status \"0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44\": rpc error: code = NotFound desc = could not find container \"0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44\": container with ID starting with 0c6abc62223877ec409f1613cb746238226007c9d888d39f19379dbc5316cf44 not found: ID does not exist" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.747706 4962 scope.go:117] "RemoveContainer" containerID="fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd" Dec 01 23:30:41 crc kubenswrapper[4962]: E1201 23:30:41.748256 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd\": container with ID starting with fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd not found: ID does not exist" containerID="fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.748306 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd"} err="failed to get container status \"fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd\": rpc error: code = NotFound desc = could not find container \"fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd\": container with ID starting with fc729ba6bee3653e01c69b262cfac4907636b04a686f3259ba4a5b6cecc1b2bd not found: ID does not exist" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.748337 4962 scope.go:117] "RemoveContainer" containerID="7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c" Dec 01 23:30:41 crc kubenswrapper[4962]: E1201 23:30:41.748711 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c\": container with ID starting with 7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c not found: ID does not exist" containerID="7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c" Dec 01 23:30:41 crc kubenswrapper[4962]: I1201 23:30:41.748736 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c"} err="failed to get container status \"7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c\": rpc error: code = NotFound desc = could not find container \"7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c\": container with ID starting with 7ab3b40b14781c327d43392fae4c188ec3b6e96012a52c6a077f9fde2f79239c not found: ID does not exist" Dec 01 23:30:42 crc kubenswrapper[4962]: I1201 23:30:42.234442 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" path="/var/lib/kubelet/pods/4c5b803d-7c90-4650-ba0e-895b8da4d9aa/volumes" Dec 01 23:30:46 crc kubenswrapper[4962]: I1201 23:30:46.850632 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/util/0.log" Dec 01 23:30:47 crc 
kubenswrapper[4962]: I1201 23:30:47.053489 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/util/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.056199 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/pull/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.056332 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/pull/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.254675 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/pull/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.270967 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/util/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.291610 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8lv8cp_1a573027-bd61-4a79-b4da-d0b25cb44908/extract/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.634406 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/util/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.833992 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/pull/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.869205 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/pull/0.log" Dec 01 23:30:47 crc kubenswrapper[4962]: I1201 23:30:47.869614 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/util/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.026260 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/pull/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.118313 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/util/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.166738 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f72t6l_fc7027df-456f-4562-8c33-b9902049338d/extract/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.292266 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/util/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.491184 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/util/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.514302 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/pull/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.605107 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/pull/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.656970 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/util/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.715040 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/pull/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.738793 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rljxv_1e0bbfb5-52f3-49bc-9db7-0d80859dbb2c/extract/0.log" Dec 01 23:30:48 crc kubenswrapper[4962]: I1201 23:30:48.910693 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/util/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.026606 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/util/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.033237 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/pull/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.060480 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/pull/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.229113 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/pull/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.259227 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/extract/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.296697 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fttmb9_2cb7ef80-3761-4574-9c1f-52405a401ebc/util/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.457725 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/util/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.610538 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/util/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.642033 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/pull/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.642119 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/pull/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.859861 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/util/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.866061 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/extract/0.log" Dec 01 23:30:49 crc kubenswrapper[4962]: I1201 23:30:49.871485 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832wntw_b6528667-e4ce-4641-9cbb-5ebfac003777/pull/0.log" Dec 01 23:30:50 crc kubenswrapper[4962]: I1201 23:30:50.075601 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-utilities/0.log" Dec 01 23:30:50 crc kubenswrapper[4962]: I1201 23:30:50.255473 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-utilities/0.log" Dec 01 23:30:50 crc kubenswrapper[4962]: I1201 23:30:50.273699 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-content/0.log" Dec 01 23:30:50 crc kubenswrapper[4962]: I1201 23:30:50.314099 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-content/0.log" Dec 01 23:30:50 crc kubenswrapper[4962]: I1201 23:30:50.457675 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-utilities/0.log" Dec 01 23:30:50 crc kubenswrapper[4962]: I1201 23:30:50.513834 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/extract-content/0.log" Dec 01 23:30:50 crc kubenswrapper[4962]: I1201 23:30:50.996814 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-utilities/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.199794 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-utilities/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.203246 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-content/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.404827 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-content/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.540030 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4sv58_3ea88a22-b18e-4b46-812c-35cb8dcdeb30/registry-server/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.612626 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-utilities/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.632061 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/extract-content/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.750749 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lg4bg_def1945b-b735-4267-8798-cdb6e28ac006/marketplace-operator/0.log" Dec 01 23:30:51 crc kubenswrapper[4962]: I1201 23:30:51.845683 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-utilities/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.081945 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-utilities/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.099172 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-content/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.159477 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-content/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.298246 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-utilities/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.335206 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/extract-content/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.519076 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-utilities/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.718809 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6bphp_6ee35195-33b7-4bc8-80fb-7eb9f0ca221f/registry-server/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.795637 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-content/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.815648 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-t6qvm_21cefc69-51ca-4baa-a4f4-aa7de0d8aa7a/registry-server/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.829153 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-content/0.log" Dec 01 23:30:52 crc kubenswrapper[4962]: I1201 23:30:52.856603 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-utilities/0.log" Dec 01 23:30:53 crc kubenswrapper[4962]: I1201 23:30:53.111523 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-content/0.log" Dec 01 23:30:53 crc kubenswrapper[4962]: I1201 23:30:53.111573 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/extract-utilities/0.log" Dec 01 23:30:53 crc kubenswrapper[4962]: I1201 23:30:53.736769 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cckxw_d6ce364a-a2f9-48fa-9c65-5f8e65da569f/registry-server/0.log" Dec 01 23:31:08 crc kubenswrapper[4962]: I1201 23:31:08.044095 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-btqq9_c021e2cb-ee5d-4e74-a5fa-1ede1fde37df/prometheus-operator/0.log" Dec 01 23:31:08 crc kubenswrapper[4962]: I1201 23:31:08.243378 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68b58c7cc9-gdr5f_b6a9273c-4395-4883-abbd-cfd15b5d552d/prometheus-operator-admission-webhook/0.log" Dec 01 23:31:08 crc kubenswrapper[4962]: I1201 23:31:08.288908 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68b58c7cc9-mdbrz_bf5940c1-cfd9-4ed4-93a0-db06782924ae/prometheus-operator-admission-webhook/0.log" Dec 01 23:31:08 crc kubenswrapper[4962]: I1201 23:31:08.471843 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-mbs2x_6cb92407-0085-483e-8079-3aa441bfd214/operator/0.log" Dec 01 23:31:08 crc kubenswrapper[4962]: I1201 23:31:08.510247 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-9stzc_07284111-fb8f-4fc6-9693-dfe6869248bf/observability-ui-dashboards/0.log" Dec 01 23:31:08 crc kubenswrapper[4962]: I1201 23:31:08.713835 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-ff5lc_32158a1b-c7c3-4fda-98d1-69443d10d0a5/perses-operator/0.log" Dec 01 23:31:24 crc kubenswrapper[4962]: I1201 23:31:24.506025 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/kube-rbac-proxy/0.log" Dec 01 23:31:24 crc kubenswrapper[4962]: I1201 23:31:24.528701 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-9658d6667-z5mx5_7b4dc242-7421-44a8-862e-b78e3e4310f3/manager/0.log" Dec 01 23:31:48 crc kubenswrapper[4962]: E1201 23:31:48.216258 4962 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:54380->38.102.83.110:46143: write tcp 38.102.83.110:54380->38.102.83.110:46143: write: broken pipe Dec 01 23:32:22 crc kubenswrapper[4962]: I1201 23:32:22.913784 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ldqbk"] Dec 01 23:32:22 crc kubenswrapper[4962]: E1201 23:32:22.915148 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="registry-server" Dec 01 23:32:22 crc kubenswrapper[4962]: I1201 23:32:22.915172 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="registry-server" Dec 01 23:32:22 crc kubenswrapper[4962]: E1201 23:32:22.915222 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="extract-content" Dec 01 23:32:22 crc kubenswrapper[4962]: I1201 23:32:22.915235 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="extract-content" Dec 01 23:32:22 crc kubenswrapper[4962]: E1201 23:32:22.915262 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="extract-utilities" Dec 01 23:32:22 crc kubenswrapper[4962]: I1201 23:32:22.915274 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="extract-utilities" Dec 01 23:32:22 crc kubenswrapper[4962]: I1201 23:32:22.915725 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5b803d-7c90-4650-ba0e-895b8da4d9aa" containerName="registry-server" Dec 01 23:32:22 crc kubenswrapper[4962]: I1201 23:32:22.918980 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:22 crc kubenswrapper[4962]: I1201 23:32:22.952746 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ldqbk"] Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.073194 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-utilities\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.073284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52c6x\" (UniqueName: \"kubernetes.io/projected/3035858f-7ac4-4319-811f-30819baa91b5-kube-api-access-52c6x\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.073590 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-catalog-content\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.176144 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-catalog-content\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.176441 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-utilities\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.176521 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52c6x\" (UniqueName: \"kubernetes.io/projected/3035858f-7ac4-4319-811f-30819baa91b5-kube-api-access-52c6x\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.178010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-catalog-content\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.179760 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-utilities\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.211676 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-52c6x\" (UniqueName: \"kubernetes.io/projected/3035858f-7ac4-4319-811f-30819baa91b5-kube-api-access-52c6x\") pod \"certified-operators-ldqbk\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.246694 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.852407 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ldqbk"] Dec 01 23:32:23 crc kubenswrapper[4962]: I1201 23:32:23.936900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldqbk" event={"ID":"3035858f-7ac4-4319-811f-30819baa91b5","Type":"ContainerStarted","Data":"3e8835a77b207d8fbe639369daae6896c7e37dcc230dc2c6ffe952954cafe195"} Dec 01 23:32:24 crc kubenswrapper[4962]: I1201 23:32:24.957651 4962 generic.go:334] "Generic (PLEG): container finished" podID="3035858f-7ac4-4319-811f-30819baa91b5" containerID="17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b" exitCode=0 Dec 01 23:32:24 crc kubenswrapper[4962]: I1201 23:32:24.957773 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldqbk" event={"ID":"3035858f-7ac4-4319-811f-30819baa91b5","Type":"ContainerDied","Data":"17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b"} Dec 01 23:32:26 crc kubenswrapper[4962]: I1201 23:32:26.986853 4962 generic.go:334] "Generic (PLEG): container finished" podID="3035858f-7ac4-4319-811f-30819baa91b5" containerID="fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8" exitCode=0 Dec 01 23:32:26 crc kubenswrapper[4962]: I1201 23:32:26.986984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldqbk" event={"ID":"3035858f-7ac4-4319-811f-30819baa91b5","Type":"ContainerDied","Data":"fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8"} Dec 01 23:32:28 crc kubenswrapper[4962]: I1201 23:32:28.001345 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldqbk" event={"ID":"3035858f-7ac4-4319-811f-30819baa91b5","Type":"ContainerStarted","Data":"1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a"} Dec 01 23:32:28 crc kubenswrapper[4962]: I1201 23:32:28.030859 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ldqbk" podStartSLOduration=3.532343554 podStartE2EDuration="6.030839258s" podCreationTimestamp="2025-12-01 23:32:22 +0000 UTC" firstStartedPulling="2025-12-01 23:32:24.96068951 +0000 UTC m=+7129.062128705" lastFinishedPulling="2025-12-01 23:32:27.459185204 +0000 UTC m=+7131.560624409" observedRunningTime="2025-12-01 23:32:28.026046963 +0000 UTC m=+7132.127486168" watchObservedRunningTime="2025-12-01 23:32:28.030839258 +0000 UTC m=+7132.132278453" Dec 01 23:32:33 crc kubenswrapper[4962]: I1201 23:32:33.248174 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:33 crc kubenswrapper[4962]: I1201 23:32:33.249583 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:33 crc kubenswrapper[4962]: I1201 23:32:33.322903 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:34 crc kubenswrapper[4962]: I1201 23:32:34.116428 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:34 crc kubenswrapper[4962]: I1201 23:32:34.174795 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ldqbk"] Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.087425 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ldqbk" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="registry-server" containerID="cri-o://1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a" gracePeriod=2 Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.602152 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.725053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52c6x\" (UniqueName: \"kubernetes.io/projected/3035858f-7ac4-4319-811f-30819baa91b5-kube-api-access-52c6x\") pod \"3035858f-7ac4-4319-811f-30819baa91b5\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.725438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-utilities\") pod \"3035858f-7ac4-4319-811f-30819baa91b5\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.725763 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-catalog-content\") pod \"3035858f-7ac4-4319-811f-30819baa91b5\" (UID: \"3035858f-7ac4-4319-811f-30819baa91b5\") " Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.726596 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-utilities" (OuterVolumeSpecName: "utilities") pod "3035858f-7ac4-4319-811f-30819baa91b5" (UID: "3035858f-7ac4-4319-811f-30819baa91b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.727269 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.734355 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3035858f-7ac4-4319-811f-30819baa91b5-kube-api-access-52c6x" (OuterVolumeSpecName: "kube-api-access-52c6x") pod "3035858f-7ac4-4319-811f-30819baa91b5" (UID: "3035858f-7ac4-4319-811f-30819baa91b5"). InnerVolumeSpecName "kube-api-access-52c6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.778638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3035858f-7ac4-4319-811f-30819baa91b5" (UID: "3035858f-7ac4-4319-811f-30819baa91b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.830054 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52c6x\" (UniqueName: \"kubernetes.io/projected/3035858f-7ac4-4319-811f-30819baa91b5-kube-api-access-52c6x\") on node \"crc\" DevicePath \"\"" Dec 01 23:32:36 crc kubenswrapper[4962]: I1201 23:32:36.830282 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3035858f-7ac4-4319-811f-30819baa91b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.101764 4962 generic.go:334] "Generic (PLEG): container finished" podID="3035858f-7ac4-4319-811f-30819baa91b5" containerID="1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a" exitCode=0 Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.101811 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldqbk" event={"ID":"3035858f-7ac4-4319-811f-30819baa91b5","Type":"ContainerDied","Data":"1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a"} Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.101843 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ldqbk" event={"ID":"3035858f-7ac4-4319-811f-30819baa91b5","Type":"ContainerDied","Data":"3e8835a77b207d8fbe639369daae6896c7e37dcc230dc2c6ffe952954cafe195"} Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.101864 4962 scope.go:117] "RemoveContainer" containerID="1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.102043 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ldqbk" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.144986 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ldqbk"] Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.147420 4962 scope.go:117] "RemoveContainer" containerID="fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.152367 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ldqbk"] Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.177097 4962 scope.go:117] "RemoveContainer" containerID="17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.251880 4962 scope.go:117] "RemoveContainer" containerID="1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a" Dec 01 23:32:37 crc kubenswrapper[4962]: E1201 23:32:37.252757 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a\": container with ID starting with 1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a not found: ID does not exist" containerID="1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.252798 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a"} err="failed to get container status \"1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a\": rpc error: code = NotFound desc = could not find container \"1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a\": container with ID starting with 1a76512aa4bc64c832ca9665551ccd7f5a19146d001c82ecf6f1e805792a886a not found: ID does not exist" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.252824 4962 scope.go:117] "RemoveContainer" containerID="fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8" Dec 01 23:32:37 crc kubenswrapper[4962]: E1201 23:32:37.253533 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8\": container with ID starting with fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8 not found: ID does not exist" containerID="fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.253554 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8"} err="failed to get container status \"fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8\": rpc error: code = NotFound desc = could not find container \"fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8\": container with ID starting with fee89e3555167aba989f2af70d4f95feeb34485291010346816766c028ae79b8 not found: ID does not exist" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.253569 4962 scope.go:117] "RemoveContainer" containerID="17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b" Dec 01 23:32:37 crc kubenswrapper[4962]: E1201 23:32:37.254130 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b\": container with ID starting with 17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b not found: ID does not exist" containerID="17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b" Dec 01 23:32:37 crc kubenswrapper[4962]: I1201 23:32:37.254174 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b"} err="failed to get container status \"17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b\": rpc error: code = NotFound desc = could not find container \"17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b\": container with ID starting with 17f54adb2d96b8f3c40e85fa4cecca97ed49589f94ade99060a561332e39613b not found: ID does not exist" Dec 01 23:32:38 crc kubenswrapper[4962]: I1201 23:32:38.252697 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3035858f-7ac4-4319-811f-30819baa91b5" path="/var/lib/kubelet/pods/3035858f-7ac4-4319-811f-30819baa91b5/volumes" Dec 01 23:33:02 crc kubenswrapper[4962]: I1201 23:33:02.784189 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:33:02 crc kubenswrapper[4962]: I1201 23:33:02.785022 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:33:10 crc kubenswrapper[4962]: I1201 23:33:10.462399 4962 scope.go:117] "RemoveContainer" containerID="f5c6acc25bde2708179e70cca992411f3e1fcc4dc13e5aab39d191b31baea2fc" Dec 01 23:33:14 crc kubenswrapper[4962]: I1201 23:33:14.654142 4962 generic.go:334] "Generic (PLEG): container finished" podID="984aec14-799d-464c-a22a-d6511adacd1b" containerID="dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595" exitCode=0 Dec 01 23:33:14 crc kubenswrapper[4962]: I1201 23:33:14.654269 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mx56v/must-gather-2mg5k" event={"ID":"984aec14-799d-464c-a22a-d6511adacd1b","Type":"ContainerDied","Data":"dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595"} Dec 01 23:33:14 crc kubenswrapper[4962]: I1201 23:33:14.655309 4962 scope.go:117] "RemoveContainer" containerID="dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595" Dec 01 23:33:15 crc kubenswrapper[4962]: I1201 23:33:15.430631 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mx56v_must-gather-2mg5k_984aec14-799d-464c-a22a-d6511adacd1b/gather/0.log" Dec 01 23:33:27 crc kubenswrapper[4962]: I1201 23:33:27.931502 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mx56v/must-gather-2mg5k"] Dec 01 23:33:27 crc kubenswrapper[4962]: I1201 23:33:27.932172 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mx56v/must-gather-2mg5k" podUID="984aec14-799d-464c-a22a-d6511adacd1b" containerName="copy" 
containerID="cri-o://815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2" gracePeriod=2 Dec 01 23:33:27 crc kubenswrapper[4962]: I1201 23:33:27.949123 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mx56v/must-gather-2mg5k"] Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.445420 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mx56v_must-gather-2mg5k_984aec14-799d-464c-a22a-d6511adacd1b/copy/0.log" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.446145 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.597318 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9tph\" (UniqueName: \"kubernetes.io/projected/984aec14-799d-464c-a22a-d6511adacd1b-kube-api-access-p9tph\") pod \"984aec14-799d-464c-a22a-d6511adacd1b\" (UID: \"984aec14-799d-464c-a22a-d6511adacd1b\") " Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.597571 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/984aec14-799d-464c-a22a-d6511adacd1b-must-gather-output\") pod \"984aec14-799d-464c-a22a-d6511adacd1b\" (UID: \"984aec14-799d-464c-a22a-d6511adacd1b\") " Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.609274 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984aec14-799d-464c-a22a-d6511adacd1b-kube-api-access-p9tph" (OuterVolumeSpecName: "kube-api-access-p9tph") pod "984aec14-799d-464c-a22a-d6511adacd1b" (UID: "984aec14-799d-464c-a22a-d6511adacd1b"). InnerVolumeSpecName "kube-api-access-p9tph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.700571 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9tph\" (UniqueName: \"kubernetes.io/projected/984aec14-799d-464c-a22a-d6511adacd1b-kube-api-access-p9tph\") on node \"crc\" DevicePath \"\"" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.812905 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/984aec14-799d-464c-a22a-d6511adacd1b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "984aec14-799d-464c-a22a-d6511adacd1b" (UID: "984aec14-799d-464c-a22a-d6511adacd1b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.850394 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mx56v_must-gather-2mg5k_984aec14-799d-464c-a22a-d6511adacd1b/copy/0.log" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.850748 4962 generic.go:334] "Generic (PLEG): container finished" podID="984aec14-799d-464c-a22a-d6511adacd1b" containerID="815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2" exitCode=143 Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.850802 4962 scope.go:117] "RemoveContainer" containerID="815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.850821 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mx56v/must-gather-2mg5k" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.871252 4962 scope.go:117] "RemoveContainer" containerID="dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.905563 4962 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/984aec14-799d-464c-a22a-d6511adacd1b-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.918909 4962 scope.go:117] "RemoveContainer" containerID="815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2" Dec 01 23:33:28 crc kubenswrapper[4962]: E1201 23:33:28.919309 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2\": container with ID starting with 815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2 not found: ID does not exist" containerID="815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.919337 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2"} err="failed to get container status \"815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2\": rpc error: code = NotFound desc = could not find container \"815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2\": container with ID starting with 815a17eba76e45bf9a9a1dd2d6ed5356c9c5c61ccd72e431c60f0e13dfa65cc2 not found: ID does not exist" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.919356 4962 scope.go:117] "RemoveContainer" containerID="dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595" Dec 01 23:33:28 crc kubenswrapper[4962]: E1201 23:33:28.919586 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595\": container with ID starting with dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595 not found: ID does not exist" containerID="dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595" Dec 01 23:33:28 crc kubenswrapper[4962]: I1201 23:33:28.919605 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595"} err="failed to get container status \"dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595\": rpc error: code = NotFound desc = could not find container \"dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595\": container with ID starting with dae883d7de94a470fe044bba7437d28204dd97cb95cf87d2c0e359768c801595 not found: ID does not exist" Dec 01 23:33:30 crc kubenswrapper[4962]: I1201 23:33:30.238455 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984aec14-799d-464c-a22a-d6511adacd1b" path="/var/lib/kubelet/pods/984aec14-799d-464c-a22a-d6511adacd1b/volumes" Dec 01 23:33:32 crc kubenswrapper[4962]: I1201 23:33:32.784616 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
Dec 01 23:33:32 crc kubenswrapper[4962]: I1201 23:33:32.784616 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 23:33:32 crc kubenswrapper[4962]: I1201 23:33:32.785039 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 23:34:02 crc kubenswrapper[4962]: I1201 23:34:02.784156 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 23:34:02 crc kubenswrapper[4962]: I1201 23:34:02.784724 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 23:34:02 crc kubenswrapper[4962]: I1201 23:34:02.784769 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k"
Dec 01 23:34:02 crc kubenswrapper[4962]: I1201 23:34:02.785630 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29bbb591d47c3a36d7be2b98516f84b98893107f8cb3db99681cfac14431b8bd"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 23:34:02 crc kubenswrapper[4962]: I1201 23:34:02.785685 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://29bbb591d47c3a36d7be2b98516f84b98893107f8cb3db99681cfac14431b8bd" gracePeriod=600
Dec 01 23:34:03 crc kubenswrapper[4962]: I1201 23:34:03.313532 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="29bbb591d47c3a36d7be2b98516f84b98893107f8cb3db99681cfac14431b8bd" exitCode=0
Dec 01 23:34:03 crc kubenswrapper[4962]: I1201 23:34:03.313606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"29bbb591d47c3a36d7be2b98516f84b98893107f8cb3db99681cfac14431b8bd"}
Dec 01 23:34:03 crc kubenswrapper[4962]: I1201 23:34:03.319087 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerStarted","Data":"fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81"}
Dec 01 23:34:03 crc kubenswrapper[4962]: I1201 23:34:03.319132 4962 scope.go:117] "RemoveContainer" containerID="c6fb37a1aaa81633d4a05ea5583bfc5dfc3c8b8e5c3dfe9438063a6a533a05f8"
Dec 01 23:34:10 crc kubenswrapper[4962]: I1201 23:34:10.628364 4962 scope.go:117] "RemoveContainer" containerID="813a9e396b8a81e5bd0b95087d22beea8f41b0b50a7de04b14344b6009234b93"
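The probe output above is a plain HTTP GET against 127.0.0.1:8798/health; "connection refused" means nothing was bound to the port at probe time, not that a handler returned an error. For orientation, a liveness endpoint of this shape is just a trivial HTTP handler; a stdlib-only sketch (illustrative, not the machine-config-daemon's real implementation):

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	// Any 2xx/3xx response counts as a probe success for the kubelet.
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	// Port and path mirror the probe target in the log above; while this
	// listener is down, probes fail with "connection refused".
	log.Fatal(http.ListenAndServe("127.0.0.1:8798", nil))
}
```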
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.187537 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txpxv"]
Dec 01 23:36:19 crc kubenswrapper[4962]: E1201 23:36:19.189197 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="registry-server"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189223 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="registry-server"
Dec 01 23:36:19 crc kubenswrapper[4962]: E1201 23:36:19.189272 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984aec14-799d-464c-a22a-d6511adacd1b" containerName="copy"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189283 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="984aec14-799d-464c-a22a-d6511adacd1b" containerName="copy"
Dec 01 23:36:19 crc kubenswrapper[4962]: E1201 23:36:19.189322 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984aec14-799d-464c-a22a-d6511adacd1b" containerName="gather"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189334 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="984aec14-799d-464c-a22a-d6511adacd1b" containerName="gather"
Dec 01 23:36:19 crc kubenswrapper[4962]: E1201 23:36:19.189352 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="extract-utilities"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189363 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="extract-utilities"
Dec 01 23:36:19 crc kubenswrapper[4962]: E1201 23:36:19.189388 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="extract-content"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189428 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="extract-content"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189829 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3035858f-7ac4-4319-811f-30819baa91b5" containerName="registry-server"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189887 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="984aec14-799d-464c-a22a-d6511adacd1b" containerName="gather"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.189902 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="984aec14-799d-464c-a22a-d6511adacd1b" containerName="copy"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.194458 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.210413 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txpxv"]
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.249282 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-catalog-content\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.249469 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-utilities\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.249599 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjghw\" (UniqueName: \"kubernetes.io/projected/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-kube-api-access-zjghw\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.352579 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-catalog-content\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.352699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-utilities\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.352800 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjghw\" (UniqueName: \"kubernetes.io/projected/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-kube-api-access-zjghw\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.353069 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-catalog-content\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.353119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-utilities\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv"
succeeded for volume \"kube-api-access-zjghw\" (UniqueName: \"kubernetes.io/projected/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-kube-api-access-zjghw\") pod \"redhat-marketplace-txpxv\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:19 crc kubenswrapper[4962]: I1201 23:36:19.551355 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:20 crc kubenswrapper[4962]: I1201 23:36:20.144605 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txpxv"] Dec 01 23:36:20 crc kubenswrapper[4962]: I1201 23:36:20.330254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txpxv" event={"ID":"6b89a7d2-e63c-43af-b064-fbb28c1fa72a","Type":"ContainerStarted","Data":"5011ffa0c72977b73ce36fa5ea9bb061379dc05ad323b7dbea9b81eb90690cde"} Dec 01 23:36:21 crc kubenswrapper[4962]: I1201 23:36:21.345428 4962 generic.go:334] "Generic (PLEG): container finished" podID="6b89a7d2-e63c-43af-b064-fbb28c1fa72a" containerID="3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941" exitCode=0 Dec 01 23:36:21 crc kubenswrapper[4962]: I1201 23:36:21.345504 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txpxv" event={"ID":"6b89a7d2-e63c-43af-b064-fbb28c1fa72a","Type":"ContainerDied","Data":"3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941"} Dec 01 23:36:21 crc kubenswrapper[4962]: I1201 23:36:21.349087 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 23:36:23 crc kubenswrapper[4962]: I1201 23:36:23.373518 4962 generic.go:334] "Generic (PLEG): container finished" podID="6b89a7d2-e63c-43af-b064-fbb28c1fa72a" containerID="bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644" exitCode=0 Dec 01 23:36:23 crc kubenswrapper[4962]: I1201 23:36:23.373576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txpxv" event={"ID":"6b89a7d2-e63c-43af-b064-fbb28c1fa72a","Type":"ContainerDied","Data":"bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644"} Dec 01 23:36:24 crc kubenswrapper[4962]: I1201 23:36:24.390100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txpxv" event={"ID":"6b89a7d2-e63c-43af-b064-fbb28c1fa72a","Type":"ContainerStarted","Data":"a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c"} Dec 01 23:36:24 crc kubenswrapper[4962]: I1201 23:36:24.424160 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txpxv" podStartSLOduration=2.857732105 podStartE2EDuration="5.424144417s" podCreationTimestamp="2025-12-01 23:36:19 +0000 UTC" firstStartedPulling="2025-12-01 23:36:21.348456874 +0000 UTC m=+7365.449896069" lastFinishedPulling="2025-12-01 23:36:23.914869186 +0000 UTC m=+7368.016308381" observedRunningTime="2025-12-01 23:36:24.418219659 +0000 UTC m=+7368.519658874" watchObservedRunningTime="2025-12-01 23:36:24.424144417 +0000 UTC m=+7368.525583612" Dec 01 23:36:29 crc kubenswrapper[4962]: I1201 23:36:29.551851 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:29 crc kubenswrapper[4962]: I1201 23:36:29.552530 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:29 crc kubenswrapper[4962]: I1201 23:36:29.624046 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:30 crc kubenswrapper[4962]: I1201 23:36:30.578702 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:30 crc kubenswrapper[4962]: I1201 23:36:30.662575 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txpxv"] Dec 01 23:36:32 crc kubenswrapper[4962]: I1201 23:36:32.524990 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-txpxv" podUID="6b89a7d2-e63c-43af-b064-fbb28c1fa72a" containerName="registry-server" containerID="cri-o://a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c" gracePeriod=2 Dec 01 23:36:32 crc kubenswrapper[4962]: I1201 23:36:32.784417 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:36:32 crc kubenswrapper[4962]: I1201 23:36:32.784836 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.090881 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.263526 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-utilities\") pod \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.264143 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjghw\" (UniqueName: \"kubernetes.io/projected/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-kube-api-access-zjghw\") pod \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.264259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-catalog-content\") pod \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\" (UID: \"6b89a7d2-e63c-43af-b064-fbb28c1fa72a\") " Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.264925 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-utilities" (OuterVolumeSpecName: "utilities") pod "6b89a7d2-e63c-43af-b064-fbb28c1fa72a" (UID: "6b89a7d2-e63c-43af-b064-fbb28c1fa72a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.265286 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.269348 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-kube-api-access-zjghw" (OuterVolumeSpecName: "kube-api-access-zjghw") pod "6b89a7d2-e63c-43af-b064-fbb28c1fa72a" (UID: "6b89a7d2-e63c-43af-b064-fbb28c1fa72a"). InnerVolumeSpecName "kube-api-access-zjghw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.284687 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b89a7d2-e63c-43af-b064-fbb28c1fa72a" (UID: "6b89a7d2-e63c-43af-b064-fbb28c1fa72a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.367843 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjghw\" (UniqueName: \"kubernetes.io/projected/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-kube-api-access-zjghw\") on node \"crc\" DevicePath \"\"" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.368165 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b89a7d2-e63c-43af-b064-fbb28c1fa72a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.542466 4962 generic.go:334] "Generic (PLEG): container finished" podID="6b89a7d2-e63c-43af-b064-fbb28c1fa72a" containerID="a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c" exitCode=0 Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.542524 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txpxv" event={"ID":"6b89a7d2-e63c-43af-b064-fbb28c1fa72a","Type":"ContainerDied","Data":"a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c"} Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.542583 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txpxv" event={"ID":"6b89a7d2-e63c-43af-b064-fbb28c1fa72a","Type":"ContainerDied","Data":"5011ffa0c72977b73ce36fa5ea9bb061379dc05ad323b7dbea9b81eb90690cde"} Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.542593 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txpxv" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.542650 4962 scope.go:117] "RemoveContainer" containerID="a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.569734 4962 scope.go:117] "RemoveContainer" containerID="bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.617440 4962 scope.go:117] "RemoveContainer" containerID="3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.619153 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txpxv"] Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.633383 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-txpxv"] Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.691057 4962 scope.go:117] "RemoveContainer" containerID="a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c" Dec 01 23:36:33 crc kubenswrapper[4962]: E1201 23:36:33.691562 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c\": container with ID starting with a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c not found: ID does not exist" containerID="a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.691623 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c"} err="failed to get container status \"a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c\": rpc error: code = NotFound desc = could not find container \"a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c\": container with ID starting with a9f22e00b7cced006819b547046c811ec917c4c40ffc867445ae0532db85d17c not found: ID does not exist" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.691659 4962 scope.go:117] "RemoveContainer" containerID="bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644" Dec 01 23:36:33 crc kubenswrapper[4962]: E1201 23:36:33.692178 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644\": container with ID starting with bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644 not found: ID does not exist" containerID="bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.692208 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644"} err="failed to get container status \"bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644\": rpc error: code = NotFound desc = could not find container \"bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644\": container with ID starting with bb79bc89c085e7b6a116094c77adaa8a90df4fc44ea534e72e0a81470aa82644 not found: ID does not exist" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.692229 4962 scope.go:117] "RemoveContainer" 
containerID="3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941" Dec 01 23:36:33 crc kubenswrapper[4962]: E1201 23:36:33.692518 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941\": container with ID starting with 3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941 not found: ID does not exist" containerID="3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941" Dec 01 23:36:33 crc kubenswrapper[4962]: I1201 23:36:33.692549 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941"} err="failed to get container status \"3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941\": rpc error: code = NotFound desc = could not find container \"3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941\": container with ID starting with 3f8ea7c4332169f4642d6a3120f4306bc90cdb40abf2a50b91dae80a594c0941 not found: ID does not exist" Dec 01 23:36:34 crc kubenswrapper[4962]: I1201 23:36:34.231327 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b89a7d2-e63c-43af-b064-fbb28c1fa72a" path="/var/lib/kubelet/pods/6b89a7d2-e63c-43af-b064-fbb28c1fa72a/volumes" Dec 01 23:37:02 crc kubenswrapper[4962]: I1201 23:37:02.784444 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:37:02 crc kubenswrapper[4962]: I1201 23:37:02.784905 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:37:14 crc kubenswrapper[4962]: I1201 23:37:14.091607 4962 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-7x948 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 23:37:14 crc kubenswrapper[4962]: I1201 23:37:14.093707 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-7x948" podUID="9f31ef6d-c116-4335-bc5d-5357a379d202" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 23:37:32 crc kubenswrapper[4962]: I1201 23:37:32.784819 4962 patch_prober.go:28] interesting pod/machine-config-daemon-b642k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 23:37:32 crc kubenswrapper[4962]: I1201 23:37:32.785420 4962 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 23:37:32 crc kubenswrapper[4962]: I1201 23:37:32.785466 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b642k" Dec 01 23:37:32 crc kubenswrapper[4962]: I1201 23:37:32.786273 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81"} pod="openshift-machine-config-operator/machine-config-daemon-b642k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 23:37:32 crc kubenswrapper[4962]: I1201 23:37:32.786342 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerName="machine-config-daemon" containerID="cri-o://fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81" gracePeriod=600 Dec 01 23:37:32 crc kubenswrapper[4962]: E1201 23:37:32.907282 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:37:33 crc kubenswrapper[4962]: I1201 23:37:33.442780 4962 generic.go:334] "Generic (PLEG): container finished" podID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" containerID="fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81" exitCode=0 Dec 01 23:37:33 crc kubenswrapper[4962]: I1201 23:37:33.443242 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b642k" event={"ID":"191b6ce3-f613-4217-b224-a65ee4cfdfe7","Type":"ContainerDied","Data":"fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81"} Dec 01 23:37:33 crc kubenswrapper[4962]: I1201 23:37:33.443717 4962 scope.go:117] "RemoveContainer" containerID="29bbb591d47c3a36d7be2b98516f84b98893107f8cb3db99681cfac14431b8bd" Dec 01 23:37:33 crc kubenswrapper[4962]: I1201 23:37:33.446230 4962 scope.go:117] "RemoveContainer" containerID="fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81" Dec 01 23:37:33 crc kubenswrapper[4962]: E1201 23:37:33.446994 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:37:45 crc kubenswrapper[4962]: I1201 23:37:45.221101 4962 scope.go:117] "RemoveContainer" containerID="fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81" Dec 01 23:37:45 crc kubenswrapper[4962]: E1201 23:37:45.222385 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7" Dec 01 23:37:57 crc kubenswrapper[4962]: I1201 23:37:57.220790 4962 scope.go:117] "RemoveContainer" containerID="fea815e65bb0aef33f3214253f41814c65cf9aad69bdca6216bc099fcf3e1e81" Dec 01 23:37:57 crc kubenswrapper[4962]: E1201 23:37:57.221916 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b642k_openshift-machine-config-operator(191b6ce3-f613-4217-b224-a65ee4cfdfe7)\"" pod="openshift-machine-config-operator/machine-config-daemon-b642k" podUID="191b6ce3-f613-4217-b224-a65ee4cfdfe7"